Sims running on ChatGPT are a glimpse into the social future of AI



Summary

Characters in video games could soon feel even more realistic. But these AI sims could also be helpful outside of gaming.

In a new paper, researchers from Google and Stanford University simulate human behavior with large language models. The paper, titled “Generative Agents: Interactive Simulacra of Human Behavior,” relies on ChatGPT and offers more than just an exciting glimpse into the future of video games.

25 AI Sims live in their own world

Inspired by the game series “The Sims”, the researchers set up a sandbox environment in which 25 AI agents go about their daily routines completely autonomously.

They gave each character a brief description of around 1,000 characters covering their profession, personality and, most importantly, their relationships with the other residents of “Smallville”. For the AI agent John Lin, for example, it looks like this:


John Lin is a pharmacy shopkeeper at the Willow Market and Pharmacy who loves to help people. He is always looking for ways to make the process of getting medication easier for his customers; John Lin is living with his wife, Mei Lin, who is a college professor, and son, Eddy Lin, who is a student studying music theory; John Lin loves his family very much; John Lin has known the old next-door couple, Sam Moore and Jennifer Moore, for a few years; John Lin thinks Sam Moore is a kind and nice man; John Lin knows his neighbor, Yuriko Yamamoto, well; John Lin knows of his neighbors, Tamara Taylor and Carmen Ortiz, but has not met them before; John Lin and Tom Moreno are colleagues at The Willows Market and Pharmacy; John Lin and Tom Moreno are friends and like to discuss local politics together; John Lin knows the Moreno family somewhat well—the husband Tom Moreno and the wife Jane Moreno.
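The paper indicates that this seed description is loaded into the agent’s memory stream as its starting memories. A minimal Python sketch of that seeding step, assuming the description is simply split into its semicolon-separated statements (the truncated text and variable names are illustrative, not the authors’ code), could look like this:

```python
# Illustrative sketch: turning a seed description into an agent's
# initial memories by splitting it into individual statements.

seed_description = (
    "John Lin is a pharmacy shopkeeper at the Willow Market and Pharmacy "
    "who loves to help people; John Lin is living with his wife, Mei Lin, "
    "who is a college professor, and son, Eddy Lin, who is a student "
    "studying music theory; John Lin loves his family very much"
)

# Each semicolon-separated statement becomes one entry in the memory stream.
initial_memories = [part.strip() for part in seed_description.split(";") if part.strip()]

for memory in initial_memories:
    print(memory)
```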

“Believable individual and emergent social behaviors”

The sim characters exhibit “believable individual and emergent social behaviors” once they are released into the virtual world, the team says. One example is a Valentine’s Day party initiated by an individual resident.

Image: Park et al.

Within two virtual days, the Sims invite each other to this very party, arrange to meet, and show up at the appointed time and place. Not all of them, of course: just like real people, some of the residents don’t feel like partying.

Complex behavior is enabled by the underlying architecture: Each AI agent has a memory and observes, plans, and reflects on its environment and social relationships.

This loop of observation, planning, and reflection drives the interactions of AI agents and determines their character. AI agents can reflect not only on their observations, but also on their plans and past reflections. For example, Klaus Müller’s self-image as someone who is highly dedicated to research is the result of several independent observations and reflections.
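As a rough illustration of that loop (a sketch, not the authors’ implementation), the following Python outline shows one way an agent could cycle through observation, planning, and reflection. The Agent class, its method names, and the llm callable standing in for a ChatGPT request are all assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Minimal stand-in for a generative agent with a memory stream."""
    name: str
    memory: list[str] = field(default_factory=list)

    def observe(self, event: str) -> None:
        # Every perceived event is appended to the memory stream.
        self.memory.append(event)

    def plan(self, llm) -> str:
        # Ask the language model for the next action, conditioned on recent memories.
        prompt = (
            f"{self.name} remembers: {self.memory[-10:]}\n"
            f"What does {self.name} do next?"
        )
        return llm(prompt)

    def reflect(self, llm) -> None:
        # Periodically condense recent memories into a higher-level insight
        # and store it back into the same memory stream.
        prompt = f"What can be concluded about {self.name} from: {self.memory[-20:]}?"
        self.memory.append(llm(prompt))
```

Because reflections are written back into the same memory, later plans and further reflections can build on them, which is how a persistent self-image like Klaus Müller’s can emerge.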

Image: Park et al.

To make this process work, all entries are stored in memory and can be retrieved at any time. In addition, to ensure that AI agents act and interact with other agents in a meaningful way, each memory is scored on three aspects:

- “Recency” assigns a higher score to memories that were accessed recently.
- “Importance” distinguishes between mundane and core memories, e.g. between breakfast and the end of a relationship.
- “Relevance” assigns higher scores to memories that relate to the current situation and thus serve as meaningful context for further action.
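A toy calculation can make the scoring concrete. The sketch below assumes the three components are normalized to [0, 1] and summed with equal weights, and that recency decays exponentially with the time since last access; the weights and decay factor are illustrative assumptions, not values from the paper:

```python
# Toy retrieval scoring based on recency, importance, and relevance.

def recency_score(hours_since_access: float, decay: float = 0.99) -> float:
    # Recently accessed memories score close to 1; older ones decay toward 0.
    return decay ** hours_since_access

def retrieval_score(recency: float, importance: float, relevance: float) -> float:
    # Equal-weight sum of the three normalized components.
    return recency + importance + relevance

# A mundane memory from two hours ago (breakfast) versus a core memory from
# yesterday (the end of a relationship) that closely matches the current situation:
breakfast = retrieval_score(recency_score(2), importance=0.1, relevance=0.2)
breakup = retrieval_score(recency_score(24), importance=0.9, relevance=0.8)
print(f"breakfast: {breakfast:.2f}, breakup: {breakup:.2f}")  # the breakup ranks higher
```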


“Smallville” in a demo.
