Ask most fans of “Western-style” RPG video games who the top developers are and they will probably give you, in some order: Bethesda, BioWare, and Blizzard. Throw out Blizzard because they make primarily action-RPGs, and you have Bethesda and BioWare constantly releasing a stream of high-quality, well-reviewed hits. However, the ‘B’ at the beginning of the names and the genre of video games they share are the only things these two companies have in common. Their games are as different as night and day.
If BioWare games are of the “Choose your own adventure” style, then Bethesda games are closer to Mad Libs. BioWare’s games are heavy on narrative. You play each “chapter” in the story and, at the end of the chapter, you can pick where to go next and your choices have some effect on the story, but you still get to the end no matter what. Along the way you can pick up side quests to help fill in the spaces of your adventure, but the focus is always on playing out the main story.
Bethesda games throw narrative out the window. Each Bethesda game consists of multiple short stories that are so separate they aren’t even on a first-name basis with each other. Beyond those are even shorter stories that you won’t discover unless you just go poking your nose in where it doesn’t belong, which is sort of the point of a Bethesda game. Bethesda games are open-world games because that’s all they really are. I know that sounds obvious, but you can’t produce linear “chapters” like BioWare does unless you have, you know, actual chapters. In a story.
Now, there are good and bad points to each style. I happen to love both. But, based on sales figures, gamers love Bethesda’s style more than BioWare’s. So, somebody at BioWare (or at BW’s parent company, Electronic Arts) looked at the sales figures and decided that what BioWare needs to sell more games is an open world!
:sigh: And so we get Dragon Age: Inquisition. A game that’s BioWare through-and-through—after all, they only know how to make narrative-heavy games—but has such huge “levels” with so much filler, the narrative gets snowed under rather quickly. Peel back the layers and there’s BioWare imprinted everywhere on this game.
Meaningless step-n-fetch quests? Got those. Mini-games? Yep, at least two (there may be more I haven’t discovered yet). Crafting that does you absolutely no good because you loot better stuff from your enemies? Uh huh. Items that don’t drop from enemies until you actually get the quest to collect them? Please, don’t mention it again lest my head explode.
We fans put up with this junk because of the good stuff BioWare throws into their games. Lots of cutscenes with fantastic dialogue. Meaningful and deep relationships. Lots of character (and lots of characters—double meaning intended). Those are all in DAI as well, but they pale next to all the not-so-good junk BioWare shovels in to make this an “open-world” game.
What sets Bethesda games apart—and, apparently, attracts more players—is not the open world, it’s the fact there IS NO narrative. The only “motivation” for poking around is poking around. You can’t just take the open-world concept (which BioWare didn’t even really do properly) and shoehorn epic narrative into it. You end up with lots of empty space when you do that, so BioWare filled the empty space with junk.
Lots and lots of junk. I’m only a dozen hours into the game and my quest log looks like the punch list for the healthcare.gov Web site designer.
It’s a slog, but I’ll keep slogging away. The narrative requires it.
 Don’t be fooled. DAI is NOT an “open world” game. It’s a game with individual levels. It’s just each level is huge. Sort of like the planet exploration from the first Mass Effect game. But with more mind-numbing junk thrown around the open space.
This is the first of two articles about PC gaming. In this episode, I’m going to cover the advantages and disadvantages of PC gaming (and, yes, there are disadvantages). In the sequel, I will provide a realistic PC gaming rig build for under $1,000. I’m not going to try to artificially hit some lowball figure by leaving out parts or dumbing down components.
First, a word from our sponsor (me)…
I’m a Gamer
I’ve been playing games since I was a teen, which means my first shiny new console was an Atari 2600. In the late 80s I got an NES. I followed that up with a Super NES. In the mid-90s I purchased my first “modern” PC and started playing games on that. I was primarily a PC gamer from about 1996 through 2006, building my own systems after my first one got old and tired. I skipped the N64/PS1 generation entirely and only picked up a GameCube midway through the following generation. I switched back to console gaming in January 2007 when I got a Wii. I upgraded to a PS3 in late 2008 and now have that PS3, a Vita, and a PS4. I still have a gaming PC, but it doesn’t get used much.
All of that provides my bona fides and my bias. I have “played both sides” of the debate. Right now I prefer console gaming, but I keep my PC upgraded enough that I can play the occasional indie game or older PC game. I buy the vast majority of new games for PS3 or PS4. I don’t dismiss PC gaming out of hand, but I don’t think PC gaming is the Master Race. It’s just another option, one with advantages and disadvantages.
This is part two of a two-part post on PC gaming. In the first episode, I examined the advantages and disadvantages of PC gaming. In this sequel, I proffer a realistic gaming rig that you can build yourself. Note that putting together a PC from components for the first time can be a bit scary, but it’s not really that hard. All prices, unless otherwise noted, come from Amazon.com. You can generally get the same parts at the same price from Newegg.com, but all the Amazon parts are Prime, so…free shipping. Prices are current as of October 22, 2014.
First, this build is going to be a micro-ATX build. Micro-ATX computers are smaller, have lower power requirements, and some of the parts (case, motherboard) are cheaper. The tradeoff is less upgradeability and a tighter working compartment. I’ll note regular ATX options where applicable if you want to go that route. Second, this is going to be an inexpensive gaming system. I’m not going super-cheap with inferior parts, which would leave you with a system that can’t play new AAA games. I’m also not going state-of-the-art (or even moderate state-of-the-art), which will keep costs down. This means you shouldn’t expect full 1080p/60fps graphics for the newest games.
There’s a humorous scene near the beginning of Mel Brooks’s History of the World, Part I. Presented as the birth of the art critic, it depicts a caveman relieving himself on a cave painting. While clearly intended as a joke, it speaks to the conflicting nature of artistic critique and how that conflict is escalating in the Internet Age. The problem with critiques of anything, but especially art, is a disconnect between what a critic is able to offer and what audiences want.
A critical assessment of something necessarily includes the critic’s subjective impressions of that thing. Even a review of a consumer good (a car, for example) includes the critic’s opinions on the feel of the car, its handling, ride comfort, placement of dashboard controls, etc. With art, subjectivity is really all the critic can report—there’s no objective way to experience artistic work.
Meanwhile, the audience for critiques increasingly demands objectivity, despite the fact that a purely objective critique would be boring and, in the case of art, impossible. This clamor is further complicated by the fact art has essentially become a consumer good. All consumers want to know is: is the product (even if the product is art) worth my money? But, no individual critic on Earth can answer that question “correctly” for everyone.
Since critics cannot “get it right” for everyone, everyone becomes a critic of the critics. This has been the case ever since “Letters to the Editor” first became a thing a few hundred years ago. Modern media accelerated the dissemination of “reader opinions.” I remember reading critical letters in nearly every issue of the comics I subscribed to as a teen.
Along comes the Internet, and not only is it that much easier to vent your spleen to the “rotten critics,” but everyone now has a platform to broadcast their opinions. Furthermore, the Internet is revealing a nasty undercurrent of psychopathic personalities who viciously attack other people rather than merely debating opinions. The result is a barrage of “opinion pieces” from every corner of the globe masquerading, in some cases, as critical reviews.
Real critiques are an in-depth examination of the subject. Artistic critiques, especially, should be about the reviewer’s experience. What does it mean? How does it make one feel? What questions does it raise? Why are these questions important? Even reviews of consumer products need that human element rather than a rote listing of “features” and “bugs” and “does it work.”
True critiques are diminishing as loud, often vulgar, and frequently hostile “review” sites take over the Web and engage in abusive battles of words with their readers. And those types of “click-bait” writers are only increasing. In Pixar’s “The Incredibles,” the villain Syndrome says, “When everyone’s super, no one will be.”
When everyone’s a critic…
 ROM: Spaceknight, Micronauts, and Star Wars in case you were wondering.
 For that kind of review, rating aggregators and e-tail “review” systems are a wonderful replacement, showing, at a glance, what other consumers judge to be a product’s worth.
I’ve been playing video games for almost 35 years. I’ve been reading gaming media (magazines and then Internet sites) for about 25 years. The game press has always been an essential part of marketing games. Developers and publishers use the gaming press to hype games and spur sales. What’s so important about the gaming hype machine? Why is it so important to get those Day One sales?
The answer is simple: the game industry is broken.
In March 2013, Square Enix published a reboot of the popular Tomb Raider series. The new game, titled simply Tomb Raider, sold 3.4 million copies during its first month of release. Soon after, Square Enix announced it was “very disappointed” in those figures, revealing it had estimated TR would sell between five and six million during the first four weeks.
A few years ago, over three million copies sold would have been a success. (Popular 2003 game Knights of the Old Republic sold only 2.2 million.) Now it’s “very disappointing.” There is a fundamental disconnect between the cost to make AAA games and the price consumers pay. While the former has been steadily rising (easily hitting nine figures for development plus marketing by some estimates), the price of games has been declining once adjusted for inflation.
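The inflation point is easy to check with rough numbers. Here is a minimal sketch, assuming a $49.99 sticker price in 1996 and an average US inflation rate of about 2.3% per year; both figures are my own ballpark assumptions, not numbers from the sales reports above:

```python
# Rough real-terms price comparison: a $49.99 game bought in 1996
# versus a $59.99 game bought in 2014. The 2.3% average annual
# inflation rate is an assumed ballpark, not official CPI data.
price_1996 = 49.99
avg_inflation = 0.023
years = 2014 - 1996  # 18 years

price_1996_in_2014_dollars = price_1996 * (1 + avg_inflation) ** years
print(f"$49.99 in 1996 is about ${price_1996_in_2014_dollars:.2f} in 2014 dollars")
```

Under those assumptions, the 1996 game works out to roughly $75 in 2014 dollars, so a $59.99 game today is noticeably cheaper in real terms even as budgets balloon.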
This wouldn’t be a problem if the market for games were expanding fast enough for sales volume to keep up with production costs, but it isn’t. While the occasional huge hit comes along to keep companies afloat, the industry as a whole is sick—not dying—just weak and stuck in endless cycles of trying to squeeze more money from gamers.
Day one DLC. Online passes. Pre-order bonuses. Collector’s editions. Micro-transactions. And, of course, endless hyping through the gaming press. Developers try to cut costs, overworking their employees and cranking out endless sequels that can re-use assets. There are some good things happening in the indie space (and some not-so-good), but, much as I like indie games, I don’t want to lose AAA gaming, and that is what is slowly happening.
The gamers are out there. The Xbox 360 and PlayStation 3 have sold a combined 166 million consoles. If even half their sales are to individuals who own both, that’s still 120 million console gamers. Plus however many play on PC (Steam had 75 million users as of this past January). So why is it so hard to sell to just a fraction of that user base? Why are gamers not buying games?
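The back-of-the-envelope math there can be made explicit. A quick sketch of the estimate, under the post’s stated assumption that half of the 166 million consoles sold went to people who own both systems:

```python
# Estimate distinct console gamers from combined hardware sales.
# Assumption (from the post): half of all units sold went to
# dual-owners, each of whom accounts for two consoles.
total_consoles = 166_000_000

dual_owner_consoles = total_consoles // 2             # 83 million units
dual_owners = dual_owner_consoles // 2                # each owns 2 consoles
single_owners = total_consoles - dual_owner_consoles  # 83 million people

distinct_gamers = dual_owners + single_owners
print(f"Roughly {distinct_gamers / 1_000_000:.1f} million distinct console gamers")
```

That comes out to about 124.5 million distinct people, so the “120 million” figure in the post is, if anything, conservative.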
Two reasons: games cost too much, and there are too many of them.
In the wake of Ubisoft not including playable female characters in either Assassin's Creed: Unity or Far Cry 4, the standard tropes are trotted out: the average age of a gamer is 31. Forty-eight percent of gamers are women. Etc.
These numbers are accurate, but are presented with no context. They come from industry information published by The Entertainment Software Association (1). They include ALL gamers. A 60-year-old playing Candy Crush Saga counts as a gamer. The legions of middle-aged women playing Bejeweled Blitz count as gamers. When a parent buys a game for a child, the age/gender of the parent is counted, not the age/gender of the child.
Well, today I spent two hours standing in line with my two sons to play the demo of Super Smash Bros. Wii U. Without doing an actual hard count, I'd estimate about 95% of the people I saw standing in line were men between the ages of 15 and 25 (2). "Core" or "hardcore" gaming--the gaming dominated by AAA games that cost $50 million to make and another $50 million to market--is overwhelmingly young and male. Citing statistics that count all the people playing free mobile games and the like is not going to change that.
And developers know it. They know the people most likely to slam down $60 for a AAA game on day of release are teen or young-20s men and the games are aimed squarely at them. If you're a woman or an older man and you want to see the industry change, there's a simple way to do it.
Spend money. Until you're spending money on hardcore gaming, hardcore games are going to keep dissing you.
(1) Link to current info. Another "fact" that's funny is "88% of games are not rated M." Watch an E3 press conference and it looks like 88% of games are rated M. (There were some trailers that were too disgusting for me. I can't imagine what the games will be like.) Just another example of how the ESA's statistics have no relevance to hardcore gaming.
(2) The actual percentage was probably higher than that; I'm being conservative. Within my immediate space in line there was one old gamer (me), one young gamer (my younger son), and one female gamer (someone in front of me whom I did not know). Everyone else was a teen or young adult male.