- Riley (right) and me at PAX East 2016, watching fans play our game for the first time.
Jacob here! I’ll never forget an unexpected text that buzzed on my phone over three years ago. At the time, I was working in the AAA game industry, but that particular message would change my life forever in a storyline that leads me all the way to this blog post, and beyond. Paul (who with Adam invented the original concept for Astroneer) was one of my favorite people in the entire industry, and in characteristic Pauline brevity the message read simply, “Want to meet up this week and talk about a business proposal?”
Years before, I had gotten to know Paul when he would saunter over from his artist den (where he led a group of five or so artists responsible for much of the beauty of Halo 4) to check out the work I was doing on lighting technology. The ironic grumble he would churn out at the artistic concessions I imposed became a familiar, recurrent micro-interaction, one I now remember fondly as among my favorite moments in the industry. Today, I’m announcing that I’ll be moving on from the game industry to work on a whole different set of challenges.
As many of you know, Paul recently passed away unexpectedly, an event that shocked the entire industry and left a huge hole in the heart of System Era. I was already in the process of leaving the company at the time, and I am grateful beyond measure for the conversations we had before his passing about life, the universe, and everything. He truly believed in my reasons for moving on, for my new Journey to Another System.
A few years ago, Paul and I went to see a talk by Nick Bostrom, a philosopher at Oxford, about his then-new nonfiction work “Superintelligence.” The book is the culmination of decades of thought experiments about the future of Artificial Intelligence, and a call for caution in a field and industry that is still among the youngest in existence. I had been following the arguments for a few years, so seeing him speak in person was a catalytic event. In the time since, the topic has become a hype machine of untold proportions, and yet the problem Bostrom writes about will remain (perhaps forever) one of the most central questions of humanity’s future. I’ll never forget when Paul remarked afterward, in his characteristically logical and innocent way, that “Humanity should probably be paying attention to this.” His (massive) library still has a signed copy.
Around the same time as the lecture, I quit my previous job to spearhead engine development on Astroneer, which has been the most fulfilling challenge of my lifetime. Since then, I also took on the role of the project’s Designer and found myself responsible for a terrifying amount of the game’s core gameplay experience. Somewhere out there you can find a video of me, at peak stress levels, hosting a pre-launch Twitch stream about which features we were going to cut for our Early Access release. I’m immensely proud of what the team was able to pull off, of the reception for this (still massively unfinished!) game, and of the team we built around it.
Lest anyone think the game is not in good hands: the family I had the pleasure of working with at System Era remains one of the most respectable and passionate teams in the industry. (If you haven’t yet applied to work there, do it now!) The last team member we hired before I left, Aaron, is the one person on the entire planet I could feel comfortable entrusting with such a great design responsibility. He has worked tirelessly to ensure that anything from the soul of the Astroneer design philosophy still locked away in my head has been extracted, and that the gameplay will continue to flourish (now unfettered by my impossible idealism!). I’m excited to see what System Era comes up with, and to play Astroneer alongside each of you, as a fan.
So, after three solid years of work on Astroneer, I’m off to “save the universe” (as the team and I joke). I’ll be working in the fledgling field of AI Safety, which is a scattered group of researchers and organizations who take seriously the concerns and risks arising from successful machine intelligence. I’d encourage anybody interested in AI to consider the arguments carefully, to ignore the hype, to read Science Nonfiction rather than Science Fiction, and to think about how much is at stake for humanity’s future. (And to also heckle me on Twitter about it.)
Looking back, there’s a surprising amount of overlap between what we were doing on Astroneer and what I’m working on now. The thing I love about Astroneer is how thematically aspirational it is, both in its emotional depths and in its reach toward the sky, stars, and beyond. Humanity is ceaselessly mediating the boundary between its internal existential questions and its external aspirations. These are exciting times, both for age-old philosophical questions and for brand new explorations and frontiers. I look forward, along with each of you, to sitting down to yet another day of questioning and learning, and to cracking open the finished version of Astroneer to find what inspiration we will among the stars.
See you around the universe[s],