Yves Jacquier oversees innovation in various production departments at Ubisoft Montreal. Four years ago, he helped Ubisoft to take ownership of new technologies by creating La Forge, a space where external academics and Ubisoft team members gather to work on research projects and prototyping.
In early March, Jacquier was meant to attend UNESCO’s Mobile Learning Week, a United Nations flagship event on information and communication technology in education, to present Ubisoft La Forge’s model and its collaboration on the Visualizing Climate Change project led by Mila (Montreal’s Institute for Artificial Intelligence). While the UNESCO event was cancelled due to the evolving COVID-19 pandemic, the project is moving forward, and the data set produced by Ubisoft La Forge is now publicly available on GitHub.
We reached out to Jacquier to learn more about his role at Ubisoft, the La Forge model, and how Ubisoft participates in the Visualizing Climate Change project.
Could you tell us a bit about yourself and your role at Ubisoft?
Yves Jacquier: I started my career in particle physics at CERN (European Organization for Nuclear Research). I also worked in medical instrumentation, telecoms, and even music. I joined Ubisoft Montreal 16 years ago, and I’m currently heading the production services departments, a role that essentially consists of guiding innovation across those departments.
For example, to produce motion-capture data, our production pipelines must be as efficient as possible, but we also need to keep exploring, testing, integrating, and improving them to be ready for future productions. We’re always thinking of the future, making sure that we’ll still be leaders in cutting-edge technologies in five years. As a result, my responsibilities have naturally evolved to help Ubisoft take ownership of new technologies such as telemetry, biometry, streaming, performance capture, and now, of course, machine learning.
Could you describe La Forge’s mission?
YJ: La Forge was created four years ago to accelerate research and development at Ubisoft. It’s a place where external academic researchers and Ubisoft team members work together on high-potential prototypes that serve both interests: for us, to improve our games; for them, to enable scientific publications. We were the first in the gaming industry to develop such an R&D recipe, where researchers have access to all of Ubisoft’s technologies, including game engines and data, as well as the expertise of our specialists. In exchange, we get access to, and the chance to participate in, high-level academic research.
La Forge was created as an interdisciplinary space by design. And because of that, we discovered that not only could we work together to improve our games while creating public knowledge, we could also contribute to solving real-world problems.
Do other companies have initiatives like La Forge?
YJ: La Forge was unique in the industry when it was founded and helped productions benefit from some useful technologies. We had a strong leadership role in machine learning in the industry; for example, we were the first to integrate systems that use voice to generate facial animations in a triple-A game, as in Assassin’s Creed Odyssey, or to generate animations on the fly in our engines. Today, the Ubisoft La Forge formula has inspired other studios and industries, which is more evidence of our success!
Can you tell us more about some of La Forge’s projects in AI?
YJ: We tend to forget that artificial intelligence and videogames are both 70 years old, and they’ve been developing side by side this entire time, one feeding into the other. Today, the quality of AI in videogames is impressive, but there are also many disruptive AI applications outside of videogames. Many recent real-world innovations are powered by AI, for example autonomous vehicles, virtual assistants, and e-learning.
These are all possible now because one type of AI, namely machine learning, has made tremendous progress over the last five years. We noticed that some of these innovations have a lot in common with videogames, which made us realize that we face challenges similar to those of industries and disciplines very different from ours. We need our NPCs to find their way just like robots do; we need our virtual vehicles to drive autonomously and adapt; we need to generate rich and believable dialogue, and characters that convey a wide range of emotions.
Thanks to our collaborations, we were able to solve some problems with machine learning in our games, improve public knowledge by supporting publications, and discover new applications outside our field.
What are some of the projects that La Forge has been involved in recently?
YJ: We worked with a technology called biometry to create models capable of easily predicting an individual’s cognitive load, in order to get quantitative feedback on our players during playtests – for example, to see what breaks the “flow” when playing. This project also benefited educators in the fields of e-learning and adaptive learning, helping them get a better sense of how students learn most efficiently when they interact with a screen. The project resulted in many articles and prototypes that are used in Ubisoft Montreal’s User Research Lab.
Another project is a prototype called sound matching, developed two years ago, which consists of “automatically” animating the lips of characters based on their voices. The process works in all languages and has been integrated into several games, including Assassin’s Creed Odyssey and For Honor. It assisted the localization teams, but it also boosted the quality of localized cutscenes, increasing players’ sense of immersion.
As it turns out, the challenge of immersion is also important in other fields, for example in some mental-health therapies. Some researchers are developing a method to treat patients suffering from anxiety or schizophrenia by confronting them with their anxieties. They use virtual reality to create avatars that interact with patients in real time. Naturally, they need to make these avatars as believable as possible, and one thing that can break that believability is unconvincing facial expressions.
We initiated a collaboration with researchers in this field to measure our capacity to create a connection with an avatar, while sharing our own learnings and technology.
Ubisoft La Forge recently contributed to a project led by Mila (Montreal’s Institute for Artificial Intelligence) revolving around climate change awareness. Can you describe what this project is about?
YJ: This project is led by Yoshua Bengio, a well-known figure in AI – especially for his work on deep learning – and a long-term partner of La Forge. The Visualizing Climate Change project aims to make climate change more concrete for everyone by using Google Street View to generate a “flooded” version of an individual's address. The concept is that you can enter your address and, if you live in a floodable zone, the application will generate a flooded image of your home and the street where you live in 2050 using sound scientific climate models. You can see it as a time machine that is able to show you an accurate version of your neighborhood in 2050.
What did Ubisoft contribute to this project?
YJ: To generate the images, the AI system needed to be trained by being shown examples of flooded and non-flooded scenes until it could generalize a set of rules for predicting what certain features look like when the water level rises.
One of the issues was the lack of examples to train such models, since, fortunately, there are not many real-life examples showing the same areas both before and after flood damage. We used assets from the version of the San Francisco Bay Area in Watch Dogs 2, generating flooded and non-flooded versions of certain places in the game engine to feed more images to the AI and help it learn.
One of the questions we had on our end was whether the images generated in the game engine would be believable enough to train the AI model. What we found was that, when the AI learned from a combination of real-life images and our in-game images, it produced much better results. It also allowed us to explore the limits of realism in our game engine.
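To make the idea of mixing game-engine renders with real photographs more concrete, here is a minimal sketch in Python/PyTorch. It is not the actual Mila or Ubisoft pipeline: the FloodPairs class, the directory layout, and the image size are assumptions for illustration only.

```python
# Minimal, hypothetical sketch (not the actual Mila/Ubisoft pipeline) of how
# paired flooded/non-flooded images from two sources -- game-engine renders
# and real photographs -- could be combined into one training set.
from pathlib import Path
from PIL import Image
from torch.utils.data import Dataset, ConcatDataset, DataLoader
from torchvision import transforms

class FloodPairs(Dataset):
    """Loads (non-flooded, flooded) image pairs that share a file name."""
    def __init__(self, root, size=256):
        # Hypothetical directory layout: <root>/non_flooded and <root>/flooded
        self.dry = sorted(Path(root, "non_flooded").glob("*.png"))
        self.wet = sorted(Path(root, "flooded").glob("*.png"))
        assert len(self.dry) == len(self.wet), "expected matched pairs"
        self.tf = transforms.Compose([
            transforms.Resize((size, size)),
            transforms.ToTensor(),
        ])

    def __len__(self):
        return len(self.dry)

    def __getitem__(self, i):
        x = self.tf(Image.open(self.dry[i]).convert("RGB"))
        y = self.tf(Image.open(self.wet[i]).convert("RGB"))
        return x, y

# Combining simulated (game-engine) pairs with real photographs is the kind
# of mixed training data Jacquier describes as improving results.
train_set = ConcatDataset([
    FloodPairs("data/simulated"),   # e.g., renders produced in the game engine
    FloodPairs("data/real"),        # real-life flood photographs
])
loader = DataLoader(train_set, batch_size=8, shuffle=True)
```

An image-to-image model could then be trained on batches drawn from this combined loader, learning from both sources at once.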
In your opinion, how could this project eventually be used for educational purposes?
YJ: We humans have a cognitive bias that makes it difficult for us to make decisions when asked about long-term impacts, since they seem very abstract. Therefore, if we want to educate people about climate change and make them aware of what they can do to help avoid it, we must show them concrete examples they can relate to. The idea with this first project is to show the consequences of flooding, but eventually, the underlying technology could be used to generate simulations involving other climate-change risks.
What are the next steps for this project?
YJ: The first phase ended with the release of the simulated flood data set, which is publicly available on our GitHub. Researchers can use the data we’ve provided to improve the project’s algorithm and train new AI models to create even more believable images. As far as I can tell, it’s the first time that Ubisoft has made this kind of data set open-source. These are assets from a video game, from a product we sell, so it’s a pretty big deal to share them publicly, and it shows our commitment to research in general, and to this project in particular. We are now thinking about other ways we can participate, and other types of data that we can provide for similar initiatives in the future.
As for Mila’s Visualizing Climate Change initiative, the results are not yet open to non-participants, as the team is still refining the AI. However, everyone can participate by sending in images of flooded houses and streets to continue training the climate model. We learned a lot about our own technology by participating in this project, so I can’t wait to see where they take it next.
For more on La Forge and other Ubisoft projects and initiatives, check out our Inside Ubisoft coverage.