I was born in the Deep South to parents who had migrated from north of the Mason-Dixon. My interest in the debate over display of the Confederate battle flag is limited to its place as a relic of a seminal event in American history. I have no vested heritage tied up in that history. However, I’ve decided to weigh in on this debate because I see in it echoes of a greater ill that plagues society and reaches even into such cultural kerfuffles as #GamerGate.
Reminders of the Confederacy are everywhere. For the past 20+ years I have lived in the capital of the Confederacy. My kids go to schools named after Thomas J. “Stonewall” Jackson, Robert E. Lee, and Jefferson Davis. A downtown street is lined with huge statues of Confederate leaders. So, go ahead, get rid of the flag; that’s only the tip of the iceberg.
Confederates were also Americans, and that cannot be erased no matter how much we try. It is baked into our cultural memory because we are all descendants of both sides of the conflict. The Civil War was a struggle that exemplified a problem that continues to infect society today; and, no, I’m not talking about racism. Rather, it is the belief that people who disagree are better off dead.
In 1854, Congress passed the Kansas-Nebraska Act, creating the territories of Nebraska and Kansas. The Act called for a popular vote in each territory to determine whether it would permit slavery. Activists from outside the territories immediately went to work, especially in Kansas. In May of 1856, Senator Charles Sumner of Massachusetts gave a blistering anti-slavery speech on the situation in Kansas, and delivered disgusting insults to a couple of Southern senators. House Representative Preston Brooks, cousin to one of the impugned senators, entered the Senate chamber three days later and beat Sumner half to death. Each man was hailed as a hero by his own side. Two days later, John Brown led a group of men along Pottawatomie Creek, KS and hacked five pro-slavery men to death.
The bloodshed had begun. It would end almost ten years and 650,000 lives later.
The Civil War began because one group of people decided another group of people should be put out of their misery. In the past 150 years, American culture—indeed, the culture in most countries around the world—has changed little. You can hear echoes of Civil War mentality in rhetoric surrounding race relations, gender identity, abortion, immigration, climate science, gun control, terrorism, religious freedom, healthcare, welfare…yea, verily, even videogames.
Two thousand years ago a Jewish carpenter stood on a hillside and told an assembled crowd that just calling someone a fool was equal to murder. Today, every minute, thousands of these verbal murders are committed in the press, in blog posts, on social media, in private conversations. If we are ever to deal with these problems in a non-violent way, it will not be in removing a flag, or changing the name of a sports team, or putting the picture of a woman on our currency. It will be when we stop vilifying people who disagree with us and have honest, open, peaceful discussions about our disagreements.
No, the Confederate battle flag should not be flown over government buildings; it is a symbol of separation and segregation. But we can’t—and shouldn’t—bury it. So let us display it in appropriate venues, and when we see it, let us point to it and remind ourselves and our children this is what happens when you decide other people are better off dead. Then let us work toward living in peace.
 Florida; sort of the Cleveland Browns of the Confederacy.
I’ve written before about how we often judge whether a decision was “good” or “bad” based on the results. A good decision can have bad results and a bad decision can have good results. The decision Pete Carroll made in Super Bowl XLIX to throw on second-and-goal has been widely excoriated as a really bad decision because it resulted in an interception that sealed New England’s victory. However, it was actually a good decision that had a really bad result because of poor play by Russell Wilson and good play by New England’s defensive backs.
First, let’s analyze the situation and the call. After a first-down run of 4 yards, the Seahawks had three plays to gain one yard. While that sounds simple “on paper,” it can be very difficult, especially at the one-yard line. Take a look at the sideline view…
Much has been made of the success of Seattle running back Marshawn Lynch on 3rd/4th downs with short yardage needed. But a running play at the goal line faces additional obstacles. On a short-yardage play at midfield, the defense still has to protect the field behind them in case of a breakthrough by the runner. At the goal line, the defense must only protect the line of scrimmage. All the open space of the end zone is irrelevant on a running play. This allows the defense to charge forward with abandon—if the runner gets past them, it’s over. So a running play is not a “gimme” at this point on the field.
The Charlie Hebdo attack in Paris on January 7 brings into focus a common theme pervading our culture…the “right” to not be offended. The jihadists who gunned down a dozen people for publishing offensive cartoons may have used the most extreme measure to intimidate, but the impetus—“thou shalt not offend me”—infects almost every line of discourse in society today.
When we begin to talk about free speech, people often retreat to a defensive argument that it’s not a violation of free speech unless the government inhibits speech. But, free speech is more than just freedom from government interference. If people do not feel free to speak their mind due to the threat of retaliation, then speech cannot possibly be said to be “free” regardless of whether or not a law has been passed.
It must be pointed out that being “offended” is an emotion, and we are (or should be) in control of our emotions. The hurtful impact of words comes from within YOU, not from the external source of those words. When your mother told you, “Sticks and stones may break your bones, but words will never hurt you,” she wasn’t lying to you. She was describing how an adult SHOULD handle hateful speech: in a mature and reasonable manner.
Have words ever made me angry? Yes. Have I sometimes lashed back in anger? Yes. Getting angry was not wrong; letting my anger affect my response was wrong. It is still wrong.
Should people generally try to be civil and not “give offense” to others? Absolutely. The world would be a better place if we all treated each other with more respect. HOWEVER, we can go a long way toward making the world a better place if we refuse to allow other people’s words to cause us to lash out—especially to the point of hunting down and killing those who offended us.
Is it permissible to condemn hateful speech? Of course! The same freedom of speech exercised by someone to utter hateful speech gives us the right to tell those people they are being hateful. Or, to put it another way, the freedom that someone has to call you a bootlicker is the freedom you have to call them a whingy coward.
This applies to discussions of religion, video games, political causes, the environment, race relations…everything. We must be free to speak our mind about anything, or we are free to speak our mind about nothing.
 Of course, there are more and more people calling for government inhibition of free speech through hate speech laws, hate crime laws, etc. My position on that argument should be made abundantly clear through this post.
 I’m trying to keep this post PG. The insults hurled around on Twitter and other places are much, much worse.
10. Everything is designed for touchscreens. Hey! I still like to use a keyboard and mouse once-in-a-while!
9. Internet memes. Especially how they are obsolete so quickly. I’m just now starting to get the hang of the “One does not simply…” meme and my son looks at me cross-eyed when I use it.
8. Switching “alternate” functions on the function keys. I use F2, F3, F4, etc. all the time, but now on many laptops they change the brightness or turn off my wireless instead of their correct functions. Drives me nuts.
7. Cloud services. Let’s see, I’ve got Dropbox, and Google Drive, and OneDrive… All our data lives online now and people wonder why we don’t have any privacy. Well, it’s because…
6. Hacking. Seriously, if the average person understood how often Web sites and online databases get hacked, the Internet would shut down from so many people disconnecting.
5. Pace of change. I got a relatively new model phone less than a year ago and it’s already an antique. :sigh: Now I have to slog through using such an obviously cheap piece of junk for another year before I can upgrade.
4. One-button interfaces. Thanks, Apple, for your “click-wheel.” Now “easy-to-use” interfaces are infecting everything from my digital thermometer to my electric toothbrush, and even my car.
3. Click-bait Web sites. And the celebrities who share them.
2. Anonymity on the Internet is making “social” media very anti-social. Some of the stuff that pops up on Twitter would never be said to someone else’s face because you’d get punched.
1. Our reliance on the Internet. If the WiFi goes out in the house, it’s a national emergency. Thank goodness we get good cell tower coverage…
 OK, some of these are funny.
 Let’s be honest, the click-wheel wasn’t that great a control scheme anyway.
10. Replayability. Really, if the game is short, it’s probably not interesting enough to play again. If it’s long enough to be interesting, it’s too long to keep playing it over and over. Make the game so I can do everything the first time. And, really, do game devs want us to play one game over and over? Don’t they want to sell us new games?
9. Epic plots. “There’s a giant hole in the sky spitting out demons! Can you please find my lost cow?” The best games have more personal stories.
8. YouTube walkthroughs. Edit. We do not need to watch you “walk through” cutscenes and endless meandering about looking for stuff. EDIT!
7. Delays. Companies have been making video games for 40+ years. You would think by now they would have figured out how to make reasonable estimates of how long it takes. On the other hand, delays are better than…
6. Bug-ridden messes. I can live with a Day 1 Patch. But it had better work after that Day 1 Patch…
5. 1080p. Apparently, if we actually want to read text in a game, we have to own 60”+ televisions. Speaking of which…
4. Text. If I want to read a book, I’ll read a book. Please don’t throw tens-of-thousands of words of text at me in the game, and expect me to read it to understand everything.
3. Nintendo 2DS. I know, I should have sprung for the extra $60 for a 3DS, but, seriously, Nintendo? This hardware is cheap. You can do better than this. You had best do better than this or your quarterly losses are going to continue mounting.
2. #GamerGate. However, GG is doing one good thing in trying to fight…
1. Censorship. If I don’t like the content of the game, I won’t buy the game. I may even tell other people my negative opinion of the game. But don’t try to block the game being made or sold; that way lies madness.
Actual side quest in Dragon Age: Inquisition
 No, I’m not going to even try. GG is the consumer side of gaming eating itself.
In the “bad old days” of the early Web, many people (myself included) preached constantly that you couldn’t design Web pages to any specific monitor resolution. The eggheads in charge of W3C came up with CSS to try to make Web design as display-agnostic as possible.
Of course, “we” lost. Designers are going to design to a specific format no matter what you tell them. They took CSS and used it to make their pages even MORE resolution-dependent. If you ever hit a Web page that doesn’t get the CSS quite right for your browser choice, you know what that means. Elements all layered on top of each other. Fonts too small to read or too big to fit in their “boxes” so you lose half the text.
Well, now the problem is creeping into video game design.
When I bought a PS3 several years ago, I bought my first “big screen” TV along with it—42”, 1080p, LCD. PS3 games look beautiful on it. So do Blu-ray movies. Heck, so do regular DVD movies. What has never looked good on it is a computer screen. When I have hooked my computer (via HDMI) to the TV and set the resolution to 1080p, the screen is unreadable from a normal (~10 feet) distance unless you have very good eyesight (I don’t).
But, this wasn’t a problem with video games because video games in the 360/PS3 era were not “designed to 1080p.” High definition TVs were still rare enough and the gaming hardware still underpowered enough that games were generally designed to 720p—even if they eventually sent 1080p output to the TV, the original design was based on 720p.
Now, with the PS4 I see game designers have upped their “base” resolution to, at least, 1080p. Here’s the problem—what looks good to designers on their high-resolution, high-pixel-density monitors a couple of feet away from their face looks horrible on my HD, low-pixel-density television that’s ten feet away from my chair. Put simply, the text is too small; I can’t read it. And I’m not alone. On top of that, small elements in the game are easily overlooked. And if you don’t even have HD TV? Forget about it.
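A rough back-of-the-envelope sketch makes the complaint concrete. The 24-pixel font height and the ~20-arcminute comfortable-reading threshold below are my own assumptions for illustration, not measurements from any particular game:

```python
import math

# Readability check for a 42" 1080p TV viewed from 10 feet.
# Assumed values (not from any specific game): a 24 px UI font
# and a comfortable-reading threshold of roughly 20 arcminutes.
ppi = math.hypot(1920, 1080) / 42       # ~52 pixels per inch
font_px = 24                            # assumed height of in-game text
font_in = font_px / ppi                 # physical height: ~0.46 inch
viewing_in = 120                        # 10 feet, in inches

# Angle the text subtends at the eye, in arcminutes
arcmin = math.degrees(math.atan(font_in / viewing_in)) * 60
print(f"{arcmin:.0f} arcminutes")       # ~13, below a comfortable ~20
```

By this estimate, text that looks generous on a designer’s desktop monitor subtends only about two-thirds of a comfortable reading angle from the couch.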
Assassin’s Creed IV. Wolfenstein: The New Order. Dragon Age: Inquisition. If I actually want to read any of the text in those games, I have to go stand a few feet away from the TV or yank my chair half-way across the room. What’s really maddening about this is that it’s easily fixable. All that’s needed is a font-size adjustment either built into the game or built into the console’s OS.
Or, you know, game designers could start allowing for us old, impaired-vision folks who don’t have Retina Displays right in front of our noses.
 Seriously, pixel density (pixels-per-inch or PPI) matters. Here’s a Web site that measures PPI for you. My 5.2” 1080p smartphone has a PPI of 423—nearly twice human visual acuity. At 42”, 1080p is only 52 PPI. It makes a huge difference, especially with text.
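For the curious, PPI is just the diagonal resolution divided by the diagonal screen size; a few lines of Python (my own sketch, independent of the site above) reproduce the numbers:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 5.2)))  # 424 (the 5.2" phone; ~423 if truncated)
print(round(ppi(1920, 1080, 42)))   # 52  (the 42" TV)
```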
 Don’t get me started on the tiny loot bags dropped by enemies in Dragon Age: Inquisition.