How are in-depth NPC scripts made?

How do people make these heavy, in-depth NPC scripts?

I’ve seen scripts where the AI searches for the player, or hears the player make a sound and can pathfind straight to the player’s actual location.

Where do they start? How do they implement other elements? Where should I start?

Here's what I'm talking about:

https://twitter.com/oTheSilver/status/1617587981803454491?s=20
https://twitter.com/oTheSilver/status/1609982926036545536?s=20
https://twitter.com/Rings0fSaturn2k/status/1625740031603228673?s=20

This is all done using what is known as a “State Machine.”

There are a lot of different State Machines out there, and Roblox doesn’t have an intuitive interface for setting one up, like Unity’s node-based editor.

But you can build one with code.
The idea is that the NPC stays in a state, doing something, until the condition for another state is met. Once that condition is met, it transitions into the new state, running that state’s actions and checking that state’s conditions until the conditions to move to the next state are met.

States are things like take cover, search for target, idle, chase, attack, die, etc.

Conditions are things like target in sight, health is 0, health is 50%, in range of target, sound was heard, time has elapsed, etc.

These are the ideas behind them.
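To make that concrete, here’s a minimal state machine loop in Luau. The state names, the ranges, and the `findTarget` helper are all illustrative assumptions, not a drop-in system:

```lua
-- Minimal state machine sketch for a Roblox NPC (Luau).
-- Assumes this Script sits inside a Model with a Humanoid and a set
-- PrimaryPart; names and numbers here are placeholders to tune.
local Players = game:GetService("Players")

local npc = script.Parent
local humanoid = npc:WaitForChild("Humanoid")

local SIGHT_RANGE = 60 -- studs; arbitrary
local ATTACK_RANGE = 6

-- Hypothetical helper: nearest player root part within sight range.
local function findTarget()
	local closest, closestDist = nil, SIGHT_RANGE
	for _, player in Players:GetPlayers() do
		local character = player.Character
		local root = character and character:FindFirstChild("HumanoidRootPart")
		if root then
			local dist = (root.Position - npc.PrimaryPart.Position).Magnitude
			if dist < closestDist then
				closest, closestDist = root, dist
			end
		end
	end
	return closest
end

-- Each state runs one "tick" of work, then returns the name of the
-- state to be in next; returning its own name means staying put.
local states = {}

function states.Idle(target)
	if target then
		return "Chase" -- condition met: target in sight
	end
	return "Idle"
end

function states.Chase(target)
	if not target then
		return "Idle" -- condition met: lost the target
	end
	humanoid:MoveTo(target.Position)
	if (target.Position - npc.PrimaryPart.Position).Magnitude <= ATTACK_RANGE then
		return "Attack" -- condition met: in range of target
	end
	return "Chase"
end

function states.Attack(target)
	if not target then
		return "Idle"
	end
	-- Deal damage, play an animation, etc. here.
	if (target.Position - npc.PrimaryPart.Position).Magnitude > ATTACK_RANGE then
		return "Chase" -- condition met: target moved out of range
	end
	return "Attack"
end

local currentState = "Idle"
while humanoid.Health > 0 do
	currentState = states[currentState](findTarget())
	task.wait(0.25)
end
```

The key design point is that every state function returns the next state’s name, so all the transition logic lives right next to the state it leaves.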


Here’s how NextBot NPCs from Valve’s Source Engine work, and I see this same structure used in Roblox AI frequently:


(replace NextBot with NPC here:)
Locomotion
This factor handles how a NextBot moves around in its environment. For example, if a NextBot was programmed to flee after being injured, it would rely on this factor to move to a different position in the map.
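On Roblox, a Locomotion layer usually wraps PathfindingService. Here’s a minimal sketch, assuming the NPC is a Model with a Humanoid and a set PrimaryPart (the `moveTo` name is mine):

```lua
-- Sketch of a Locomotion layer on top of Roblox's PathfindingService.
local PathfindingService = game:GetService("PathfindingService")

local function moveTo(npc, goalPosition)
	local humanoid = npc:FindFirstChildOfClass("Humanoid")
	local path = PathfindingService:CreatePath()
	path:ComputeAsync(npc.PrimaryPart.Position, goalPosition)
	if path.Status ~= Enum.PathStatus.Success then
		return false -- no route; the AI can fall back to another behavior
	end
	for _, waypoint in path:GetWaypoints() do
		if waypoint.Action == Enum.PathWaypointAction.Jump then
			humanoid.Jump = true -- the path calls for a jump here
		end
		humanoid:MoveTo(waypoint.Position)
		humanoid.MoveToFinished:Wait() -- walk waypoint to waypoint
	end
	return true
end
```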

Body
This factor handles the animations of a NextBot. Returning to the injury example, a NextBot would rely on this factor to play a flinching animation.
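On Roblox this maps to a thin wrapper around the Animator. A sketch, assuming the Humanoid has an Animator child and you substitute your own animation asset id:

```lua
-- Sketch of a Body layer: one place that owns animation playback.
local function playAnimation(npc, animationId)
	local humanoid = npc:FindFirstChildOfClass("Humanoid")
	local animator = humanoid:FindFirstChildOfClass("Animator")
	local animation = Instance.new("Animation")
	animation.AnimationId = animationId -- placeholder; use your own asset id
	local track = animator:LoadAnimation(animation)
	track:Play()
	return track
end

-- Example hookup: play a flinch whenever the NPC takes damage.
-- humanoid.HealthChanged:Connect(function()
--     playAnimation(npc, FLINCH_ANIMATION_ID) -- hypothetical constant
-- end)
```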

Vision
This factor handles how a NextBot sees certain entities in its environment. The field-of-view and line-of-sight functions mainly reside in this factor.

Keep in mind that this factor is not required for NextBots to work. A Skeleton in Team Fortress 2, for example, will find and attack enemies regardless of whether or not it sees them.
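A Vision factor on Roblox typically combines a field-of-view angle check with a line-of-sight raycast. A sketch; the 90-degree FOV is an arbitrary assumption:

```lua
-- Sketch of a Vision check: FOV cone plus a line-of-sight ray.
-- Assumes `npc` has a set PrimaryPart facing the way the NPC "looks".
local FOV_DEGREES = 90 -- arbitrary; tune for your game

local function canSee(npc, targetPart)
	local origin = npc.PrimaryPart.Position
	local toTarget = targetPart.Position - origin

	-- Field of view: angle between the NPC's facing and the target.
	local facing = npc.PrimaryPart.CFrame.LookVector
	local dot = math.clamp(facing:Dot(toTarget.Unit), -1, 1)
	if math.deg(math.acos(dot)) > FOV_DEGREES / 2 then
		return false -- target is outside the vision cone
	end

	-- Line of sight: raycast toward the target, ignoring the NPC itself.
	local params = RaycastParams.new()
	params.FilterDescendantsInstances = {npc}
	params.FilterType = Enum.RaycastFilterType.Exclude
	local result = workspace:Raycast(origin, toTarget, params)
	-- Nothing hit means a clear path; otherwise we must have hit the target.
	return result == nil or result.Instance:IsDescendantOf(targetPart.Parent)
end
```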

Intention
This factor is where the actual AI of a NextBot resides. The Intention factor manages the different behaviors a NextBot might have, and this factor is responsible for changing these behaviors depending on the event.
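A sketch of that idea: the Intention layer owns the current Behavior and swaps it when an event calls for a change. The behaviors and the trigger below are hypothetical:

```lua
-- Sketch of an Intention layer that swaps Behaviors in response to events.
local currentBehavior = nil

local function setBehavior(behavior)
	if currentBehavior then
		currentBehavior.stop() -- let the old behavior clean up
	end
	currentBehavior = behavior
	behavior.start()
end

-- Hypothetical behaviors; real ones would drive Locomotion/Body/Vision.
local PatrolBehavior = {
	start = function() print("start patrolling") end,
	stop = function() print("stop patrolling") end,
}
local FleeBehavior = {
	start = function() print("start fleeing") end,
	stop = function() print("stop fleeing") end,
}

setBehavior(PatrolBehavior)
-- An event drives the change, e.g. dropping below 25% health:
-- humanoid.HealthChanged:Connect(function(health)
--     if health < humanoid.MaxHealth * 0.25 then
--         setBehavior(FleeBehavior)
--     end
-- end)
```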

Behavior
A Behavior contains a series of Actions, which it will perform when the Intention factor chooses it.

A Behavior can be considered the NextBot equivalent of a schedule, since both are lists of actions the AI needs to perform.

Action
This features the actual AI code of a NextBot, which will run when its parent Behavior is run by the Intention factor. Actions can have an additional child Action, which will run at the same time as its parent Action.

An Action can be considered the NextBot equivalent of a task, since both contain the core programming that drives the AI itself.
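Putting the last two pieces together, here’s a sketch of the Behavior/Action relationship, including a child Action running alongside its parent; the specific actions are placeholders:

```lua
-- Sketch of the Behavior/Action split: a Behavior is an ordered list of
-- Actions, and each Action holds the code that actually runs.
local function runBehavior(behavior)
	for _, action in ipairs(behavior.actions) do
		if action.child then
			task.spawn(action.child.run) -- child Action runs alongside its parent
		end
		action.run()
	end
end

-- Placeholder Behavior: the action bodies stand in for real AI code.
local TakeCoverBehavior = {
	actions = {
		{ run = function() print("move to cover") end },
		{
			run = function() print("crouch") end,
			child = { run = function() print("scan for threats") end },
		},
	},
}

runBehavior(TakeCoverBehavior)
```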

Source: NextBot - Valve Developer Community

