Dr Robin Sloan teaches game design and development practices at Abertay University, where he is the programme leader for BA (Hons) Game Design and Production. He has published articles on game design, games culture, and educational games.
The Saint: What was it about video games that first interested you and made you want to make them?
Dr Robin Sloan: Computers, rather than consoles, were the gaming platform of choice in the 1980s. I was fascinated by the peculiar games that emerged on the Spectrum and the Commodore, like Manic Miner, Dizzy, and Boulder Dash. The game that most influenced my career, however, was Starflight, which I played on the Mega Drive. It drove me to consider game design as a career choice.
TS: Looking back on when you first started in the industry, are you surprised by the amount of development and progress that video games as a medium and the industry as a whole have made? What do you think the next big steps for video games might be?
RS: My first professional role making games was in 2005, when I was recruited as a “Game Artist.” 2005 was a time when the business and operational models for game development and publishing were still rather limited. Games were generally made by medium-to-large studios, composed mainly of experienced developers. While being an excellent candidate is still a requirement for today’s graduates, I think it’s great to see so many opportunities for internships and more visible junior roles. As an industry, we have a much better approach to training and bringing in new talent. We also have a much more diverse outlook on how games are made, who they are made for, and how they should be bought and played. It has been particularly interesting to observe the rise of what we generally term “indie” games. In 2005 these games were very much on the fringes of games culture. From the mid-2000s onwards, indie games were hugely supported by platforms such as Steam. There are now many more ways to make games and to make money from the process. Even one-person “studios” can develop and launch successful games, and these games sit on digital storefronts alongside games made by studios of 200+ developers.
TS: How important do you think video games are to the culture of modern society, and do you think a certain amount of responsibility comes with that?
RS: It has been stated many times before, but digital games, as a broad medium encompassing everything from console-based action titles through to obscure art games for browsers, are arguably a defining medium for the 21st century. They allow us to become actors within a given context: to make choices and see what the consequences of those choices might be. In this sense, I think digital games are a very important medium for modern society, and central to the digital humanities. As Ian Bogost has argued, they can be used as a form of procedural rhetoric, which does confer some responsibility on the part of the game designer. Games can be used to reveal complex meanings, to make arguments, and to convince us to reflect on our humanity, our attitudes, and our behaviours. In a climate of fake news (what used to be properly called propaganda), game makers need to think about what messages and ideas they are building into game worlds.
TS: Finally, do you have any professional tips on how best to proceed for those who want to start getting into the game development industry?
RS: Most arts and science degrees will prepare students for critical thinking in a creative role. What is perhaps missing is the fundamental knowledge base pertinent to games and, essentially, experience of game development. Games degrees focus on these two areas. And for roles in art and in programming, a deep knowledge of digital art techniques or computer programming is essential. A history or philosophy graduate may be well placed to consider postgraduate study in design, while a business or management graduate could look at production, and a physics or mathematics graduate could consider software engineering as a route into games programming.