Introduction
This is the final part of a long series of questions and answers I have been working on with a university professor of mine. In case you haven’t read part 1 and part 2, the idea for “Conversations on History and Gaming” was sparked when that professor, preparing a presentation on the importance of video games and how they relate to History, came to me with several questions. I have taken my answers, added some extra commentary, inserted a couple more topics, and put it all together to give you something of value for people both inside and outside academia.
For the third and final part, I wanted to discuss how we (players, media, and teachers) should look to the future, the importance of digital literacy, and why History university courses should consider creating subjects on Historical Media. If you’re starting with part 3, I really recommend reading the first and second parts for context.
As a final reminder, these questions were asked by a person whose career was not built around video games but who was genuinely curious about them, so a couple of them might seem a bit off or strange. That’s exactly what makes them interesting and provocative: they give us a glimpse of how an academic mind approaches video games, what doubts it holds, and how it perceives this strange new world.
Historical Media, Digital Literacy, and the Limitations of Gaming
What’s “digital literacy” all about? And why might it be a limiting factor?
Digital media as a whole, and games in particular, have a higher barrier to entry for the average person. For books, all you need is the book itself, which is relatively cheap, and the ability to read. For movies and television, all that’s required is a TV, and access to those has been widespread for well over 70 years. Games, however, are a bit more complicated.

First off, gaming has become a lot more widespread in the last two decades, with nearly half of the world’s population playing some sort of game (yes, I know that most of it is mobile). Still, there’s a massive demographic discrepancy between generations, with each older generation playing significantly less than the one that comes after it. Gaming might eventually become as ubiquitous as TV and books once Gen Alpha enters the workforce, some 10 to 15 years from now, as over 80% of them engage with video games every week.
But gaming’s issues don’t end there. Overall, the audience is quite fragmented across platforms and genres. Mobile dominates the market, with over 80% of players gaming there, while 26% play on PC and 18% on consoles. These percentages overlap because around 70% of players play on more than one platform.
Things are even more convoluted when it comes to genres, where favorites can swing widely from year to year depending on what’s popular at a given time and place. For years, the scene was dominated by flight sims and strategy; then FPS took over; RPGs and MMOs had their time in the spotlight; MOBAs came and went (some stayed); survival was relevant for a couple of years; then Soulslikes started to become the norm, and the wheel goes round and round. This is, of course, pretty normal, and the same phenomenon can be seen in books, movies, and music. Games, however, are substantially more expensive: an average spend of around $50, compared to $5 to $10 for a book or a $15 monthly subscription to a streaming service. I often see people argue that games are a lot cheaper if you measure entertainment by the hour, but not once in my life have I seen someone outside of gaming look at things from that perspective, so I don’t consider it a valid argument; there’s still a very strong up-front payment for something you might not be totally sure you’ll enjoy.
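For what it’s worth, the arithmetic behind that cost-per-hour argument is trivial. Here’s a minimal sketch in Python using the prices above; the playtime and reading-time figures are my own assumptions, not data.

```python
# Toy cost-per-hour comparison. Prices come from the paragraph above;
# the hours of entertainment are assumed figures, not real data.
media = {
    "game ($50, ~60 h of play)":         (50.0, 60),
    "book ($10, ~8 h of reading)":       (10.0, 8),
    "streaming ($15/mo, ~20 h watched)": (15.0, 20),
}

for label, (price, hours) in media.items():
    print(f"{label}: ${price / hours:.2f} per hour")
```

Even when the per-hour math happens to favor games, my point stands: a newcomer sees the $50 up front, not the sixty hours behind it.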
To add even more fuel to an already expensive fire, computers and laptops come at quite a premium, with entry-level gaming PCs going for at least $800 and a mid-to-high-end machine costing around $2,000.

All of this paints, in broad strokes, the gaming scene as it is in 2026. Fortunately, with new, more digital generations coming into the fold, digital literacy is a diminishing issue and less of a barrier to entry than it was a decade ago, but I still think it’s worth going over some of the limitations it certainly brings, especially considering that not everyone has the same level of familiarity. Even within the gaming community, digital proficiency varies quite extensively from person to person. Some people can run and operate a PC just fine. Most just want to run a couple of games, and that’s it. Others face issues troubleshooting even simple problems. Some can do all of the above but struggle with modding tools, whether from lack of interest or of familiarity. My point is, digital literacy can oscillate dramatically, so here are three prerequisites I think History professors should meet when attempting to bring media (games) into the classroom in a competent and interesting way.
- Level 1: Be comfortable working with a computer: installing, running, and troubleshooting games;
- Level 2: Be informed about the games you’ll be running. Know a game’s limitations and strong points;
- Level 3: Be capable of modding the game (if applicable) or using in-game editors to fit whatever message you’re trying to convey.
Any professor who meets these three high-level prerequisites would be able to turn a boring class into something interesting and worth attending. Why just read about some military tactic if you can use a game to show it in action? Of course, even within those prerequisites there’s a lot of work, and many levels in between each, but basic proficiency with all three would solve 90% of the issues and doubts that might arise, and provide students with a smooth experience.
Why should Universities care about games, at all?
Having studied both History and Media at university, I have always held the opinion that a class on Historical Media should at least be an optional subject in any History degree. That’s not a class on the History of Media; we have plenty of those. It’s a class where a professor would go over some of the major games, books, and movies that portray History, properly criticize them for their virtues and flaws, and explain how historical perception can be changed and molded by media over time.

History is filled with myths, and we have even tackled some of those that have made their way into gaming and are now taken at face value as historical fact.
What do games need to be considered historical?
I don’t think anyone has ever seriously examined what criteria a video game needs to meet to be considered “historical”. During my interview, I mentioned that “thousands of games do claim to be historical, but to claim and then to properly be are two very different things”. Games like Age of Empires 2 and Call of Duty (look at their Steam tags) do claim to be historical, and to a certain extent they are, given that they take place in a specific period of human History. But can we consider them a good representation of History? I don’t think so. So the question needs to be a bit broader: what are the criteria that make a video game a good representation of History?
So, for a game to be historical, it needs a lot more than just being set “sometime” in History. Do I know the answer to that question? Not really, and I don’t think anyone can postulate a definitive criterion for what makes a game worth being called “historical”; at least, no such postulation will be free of flaws. With that in mind, here is what I think:
- Criterion 1: Temporal setting – It needs to be set in a real, well-defined historical setting (e.g., 15th-century medieval England), with a clearly delimited time and geographical scope.
- Criterion 2: The game’s rules must emulate the constraints and rules of the systems (economic, social, military, technological, etc.) of the time and place it’s trying to recreate.
- Criterion 3: The player must have agency to change historical events the majority of the time.
- Criterion 4: The impact of this agency needs to produce historically plausible outcomes.
- Criterion 5: It must have some semblance of material fidelity, such as language, technology, architecture, clothing, and art, depending on the game’s level of abstraction (the less abstract the game, the higher the expected material fidelity).
Arguments can be made for and against each criterion, but overall I think this is a solid proposal that covers most bases and provides a framework for future discussion. It’s a touchy subject, and one that needs a much more in-depth look than falls within the scope of this article. As I said, it would make quite an interesting topic for an aspiring student looking for a challenging doctoral degree.
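To make the framework slightly more concrete, here’s a toy Python sketch, entirely my own illustration, of how these five criteria could be encoded as a checklist. The class, the scoring, and the example verdict are all invented for this article.

```python
from dataclasses import dataclass

@dataclass
class HistoricalRubric:
    # Field names mirror the five criteria above; everything else is invented.
    temporal_setting: bool    # real, well-defined time and place
    systemic_emulation: bool  # rules emulate the period's systems
    player_agency: bool       # player can alter historical events
    plausible_outcomes: bool  # that agency yields plausible results
    material_fidelity: bool   # language, technology, architecture, art...

    def verdict(self) -> str:
        met = sum(vars(self).values())  # count the criteria marked True
        return f"{met}/5 criteria met"

# Invented example: a title that nails setting, systems, agency, and looks,
# but lets alternate history run off into implausibility.
print(HistoricalRubric(True, True, True, False, True).verdict())  # -> 4/5 criteria met
```

A real rubric would obviously need graded scores rather than yes/no answers, but even this binary version shows how a claimed “historical” tag could be interrogated criterion by criterion.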
Games do have limitations. What are some of those?
Games have plenty of limitations, but two balancing acts in particular are tricky to pull off successfully. The first, I would argue, is the necessary equilibrium between what’s real and what’s fun: an age-old debate built on the idea that the more realistic a game is, the harder it can be to interact with its systems, since there are more of them at play, they’re interconnected, and designing an engaging experience around them is difficult. While this has held true for the longest time, I also think things are getting better, with much more attention being paid to the player’s experience with the user interface, and with automated systems allowing some games to be quite authentic without sacrificing mechanical depth, Europa Universalis 5 being the most recent (and impressive) example.

The second might be a lot trickier to fix: the balancing act between historical railroading and player agency. History already was; there’s no changing that. So how do you design a game that navigates between what was and what might have been? While I can see valid arguments on both sides, I also think we’ve been looking at this question wrong from the start. Take, for example, the Battle of Normandy. Should the player be forced to play the battle “as it was”, day by day? Possibly, yes. But I think games are at their most interesting when the player is the one in command, not guided by an invisible, all-knowing, all-powerful hand. There are certainly games that would benefit from that approach, such as first-person shooters or small-scale titles that zoom into one specific part or area of a given event. But it cannot be extended to grand-strategy titles, where the player’s role can span hundreds of years with no breaks between scenarios, and where deviation from historical fact grows the further the game runs from its starting point.

I think a game should provide the player with the necessary tools at the start of a scenario, then let them run with those tools, whatever they may be: orders of battle, historical characters, decisions, etc. As for the outcome, it should not be a concern unless the deviation from historical reality reaches the point of ridicule. That said, I don’t think many games even allow for that.
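As a thought experiment, here’s what that design philosophy might look like reduced to a few lines of Python: historical starting conditions, free player decisions, and a check that only fires when things get ridiculous. Nothing here maps to a real engine; every name, number, and the deviation metric itself is hypothetical.

```python
import random

# Hypothetical sketch of "seed historically, then let it run".
def run_scenario(turns=10, absurdity_threshold=0.9):
    deviation = 0.0                      # 0.0 = events unfolding exactly as they did
    for turn in range(turns):
        player_choice = random.random()  # stand-in for the player's free decisions
        deviation = min(1.0, deviation + player_choice * 0.1)
        if deviation > absurdity_threshold:
            # Only intervene when deviation reaches the point of ridicule.
            print(f"Turn {turn}: deviation {deviation:.2f} -- warn, don't railroad")
        else:
            print(f"Turn {turn}: deviation {deviation:.2f} -- let the player play")

run_scenario()
```

The design choice worth noticing is that the simulation never forces the historical outcome; it only measures drift and, at worst, warns. That is the opposite of the all-knowing hand I described above.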
Then, how do we balance free player agency and historical facts?
This is only a problem when the two are framed in opposition. Are player agency and an alt-historical outcome a negative? Why? The historical reality is already there; we know what it is, and the game doesn’t change it. The biggest advantage of playing historical titles is the interactive nature of the medium, which already presupposes that no action will ever be truly historical. And that’s okay. It’s not the goal. The goal is to immerse players and students in the historical reality that led to those outcomes, to better understand the decisions that were made and how things turned out the way they did.
Conclusion
I hope you have enjoyed part 3 of this long article; a couple of you have even emailed me wondering when I would manage to conclude it. I still think universities have a long way to go to properly incorporate media into teaching, especially video games. It’s a massive opportunity to turn simple, often boring lessons into real learning opportunities, and maybe even convert a couple of people into readers of Strategy and Wargaming.
Support Strategy and Wargaming
I do what I do at Strategy and Wargaming because I love it, and I’m never going to stop. If you would like to support me, you can buy me a coffee for a dollar if you’re feeling generous. If you can’t, no worries: Strategy and Wargaming will always be free, and I’d love to have you around!
