Thursday, October 5, 2017

The Last of Us: Modularizing a Perception System


It is generally good AI practice to architect a codebase in a manner that reflects the rules of the world. The infected enemies in The Last of Us have tremendously blurred vision and instead perceive the world through an auditory system. Therefore, the AI code for the infected's perception should contain a system in which events in the world generate "logical sounds" that raise an entity's awareness of the player.

I thought I would spend a fun couple of hours seeing how this kind of system could be modularized for scalability and reuse. Here are the bare bones of the technical architecture I came up with. This was made with Unity and C#.

1) SoundEvent - When an event in the world produces sound, it creates a SoundEvent object.
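As a minimal sketch, the event can be a plain data object. The exact fields are just one reasonable choice (a fuller system might also carry a sound type, duration, and so on):

    using UnityEngine;

    // A plain data object describing a sound emitted into the world.
    // The fields here are illustrative, not the only possible design.
    public class SoundEvent
    {
        public readonly Vector3 position;     // world position of the source
        public readonly float baseIntensity;  // loudness before any dampening

        public SoundEvent(Vector3 position, float baseIntensity)
        {
            this.position = position;
            this.baseIntensity = baseIntensity;
        }
    }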


2) For example, when the player is walking, the PlayerController creates SoundEvent objects based on movement.
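A hypothetical fragment of that controller could look like the following; the footstep interval and intensity values are made up for illustration, and the singleton access pattern is sketched under (3) below:

    using UnityEngine;

    // Hypothetical fragment: emits a footstep SoundEvent while the player
    // moves. The interval and intensity values are illustrative.
    public class PlayerController : MonoBehaviour
    {
        public float footstepIntensity = 1.0f;
        public float footstepInterval = 0.5f; // seconds between footsteps
        private float nextFootstepTime;

        void Update()
        {
            bool moving = Input.GetAxis("Horizontal") != 0f
                       || Input.GetAxis("Vertical") != 0f;
            if (moving && Time.time >= nextFootstepTime)
            {
                SoundPropogationManager.Instance.soundEventTriggered(
                    new SoundEvent(transform.position, footstepIntensity));
                nextFootstepTime = Time.time + footstepInterval;
            }
        }
    }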


3) SoundPropogationManager - All sound events flow through this singleton manager. This design has a few motivations:

  • To decouple sound generators from receivers.
  • To centralize logic such as dampening over distance and environmental occlusion of the sound.
  • It is a singleton because it behaves almost like a static helper class, but it may need to maintain state.
a) soundEventTriggered - Fetches all components registered to listen for sound events, computes the dampened intensity for each, and propagates the result to the listening components. A sketch of this core flow is below.
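To keep the pieces readable I split the manager into partial classes, with the helpers filled in under items (b) through (d) below. The class and method names follow the design above; the bodies are just one way to do it:

    // Core flow of the singleton manager. getListenableObjects and
    // dampIntensity are sketched under items (c) and (d) below, and the
    // IListenable interface under item (b).
    public partial class SoundPropogationManager
    {
        public static SoundPropogationManager Instance { get; } =
            new SoundPropogationManager();
        private SoundPropogationManager() { }

        public void soundEventTriggered(SoundEvent soundEvent)
        {
            foreach (IListenable listener in getListenableObjects())
            {
                // Dampen per listener, then hand the result over.
                float dampened = dampIntensity(soundEvent, listener);
                listener.onSoundHeard(soundEvent, dampened);
            }
        }
    }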


b) IListenable - The interface through which SoundPropogationManager identifies components that are listening for sound events. This decouples the manager from the components, allowing multiple types of AI behaviors to listen for sound events.
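The interface only needs a position (so the manager can dampen per listener) and a callback; both member names here are my own choice:

    using UnityEngine;

    // The contract the manager uses to find and notify listeners.
    public interface IListenable
    {
        // Where the listener is, so the manager can dampen by distance.
        Vector3 Position { get; }

        // Called with the intensity already dampened for this listener.
        void onSoundHeard(SoundEvent soundEvent, float dampenedIntensity);
    }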


c) getListenableObjects - The listening entities should be decoupled from the manager. Therefore, the manager pulls this data from the scene instead of having individual entities register with it (even though this comes with a performance overhead).

The manager may also register and unregister entities based on the positions of the player and the entity. For example, if the player is in a room, only the other entities in that room need to listen for sound events, as sketched below.
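One straightforward (if not cheap) way to pull listeners from the scene is to scan the loaded MonoBehaviours for the interface; a room or distance filter could be applied in the same loop:

    using System.Collections.Generic;
    using UnityEngine;

    public partial class SoundPropogationManager
    {
        // Pulls listeners from the scene on demand rather than keeping a
        // registration list, accepting the lookup cost for the decoupling.
        private List<IListenable> getListenableObjects()
        {
            var listeners = new List<IListenable>();
            foreach (MonoBehaviour behaviour in
                     Object.FindObjectsOfType<MonoBehaviour>())
            {
                // A room/distance check could filter candidates here.
                if (behaviour is IListenable listener)
                    listeners.Add(listener);
            }
            return listeners;
        }
    }

If the scan proves too expensive per event, the result could be cached and refreshed at a lower frequency without giving up the decoupling.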


d) dampIntensity - The intensity is dampened exponentially over distance. This means that movement close to an NPC has a much greater impact on it than the same movement farther away.
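A minimal version of that falloff, with the decay constant as a tunable assumption:

    using UnityEngine;

    public partial class SoundPropogationManager
    {
        // Exponential falloff: intensity decays with distance, so nearby
        // movement dominates. The decay constant is a tunable assumption.
        private const float falloffRate = 0.5f; // per world unit

        private float dampIntensity(SoundEvent soundEvent, IListenable listener)
        {
            float distance = Vector3.Distance(soundEvent.position,
                                              listener.Position);
            return soundEvent.baseIntensity * Mathf.Exp(-falloffRate * distance);
        }
    }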


4) ListenBehavior - A low-level NPC behavior that listens for sound events. This behavior may be employed by multiple high-level "skills"; for example, a "wander" skill and a "sleep" skill may both invoke ListenBehavior. The behavior implements the IListenable interface.
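Tying it together, a bare-bones ListenBehavior might accumulate awareness from the dampened intensities it receives; the threshold logic is illustrative:

    using UnityEngine;

    // A low-level listening behavior that higher-level skills can employ.
    // The awareness accumulation and threshold are illustrative.
    public class ListenBehavior : MonoBehaviour, IListenable
    {
        public float awarenessThreshold = 1.0f;
        private float awareness;

        public Vector3 Position => transform.position;

        public void onSoundHeard(SoundEvent soundEvent, float dampenedIntensity)
        {
            awareness += dampenedIntensity;
            if (awareness >= awarenessThreshold)
            {
                // A high-level skill ("wander", "sleep") would react here,
                // e.g. by turning toward soundEvent.position.
                Debug.Log(name + " heard something at " + soundEvent.position);
            }
        }
    }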


The result is captured in the video below. You can also find a link to the codebase here: