How would you find the point with the least deviation?

This is a hard question to phrase, but what I am trying to ask is: how would I calculate the center of a cluster of given player positions? For example, I would want this brown brick to look at the place with the most players (represented as spheres in this example) bunched up together, without drifting to the side toward single-point outliers. What I want it to find is the average of the circled clusters, based on how close the points are (magnitude) to each other compared to points outside that cluster.


(image of example mentioned)
Basically, I am trying to create a part that looks at crowds of people and finds the optimal position to look at (currently I am using a mean average deviation formula, but it has the problem of drifting toward extreme outliers). I would like to know how to go about finding this "optimal spot", as I am stumped by it right now.
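For reference, the averaging I'm doing now boils down to a plain mean of every position, something along these lines (simplified sketch):

```lua
-- Simplified sketch of a plain mean over all positions; averaging
-- everything is exactly what lets one far-away player drag the result.
local function meanPosition(positions: {Vector3}): Vector3
	local sum = Vector3.zero
	for _, pos in positions do
		sum += pos
	end
	return sum / #positions
end
```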

Hello!
Your question is not trivial at all; it is a common problem in data science and other fields! There are many resources on the internet, so feel free to do a quick search. I recommend digging deeper into this subject if it interests you.

The right algorithm depends on your specific use case, in particular on whether you prefer accuracy or performance.
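As a starting point, here is a minimal Luau sketch of one common idea: greedily group players that are within some radius of each other, pick the largest group, and aim the part at that group's centroid. The part name "Watcher" and the RADIUS value are placeholders, and the greedy grouping is a naive stand-in for a real clustering algorithm such as DBSCAN:

```lua
-- Naive density-based clustering sketch: group positions whose pairwise
-- distance is under RADIUS, then aim a part at the centroid of the
-- largest group. RADIUS and the greedy grouping are assumptions; tune
-- or replace them for your game.
local RADIUS = 20 -- studs; how close players must be to count as one cluster

local function clusterPositions(positions: {Vector3}): {{Vector3}}
	local clusters: {{Vector3}} = {}
	for _, pos in positions do
		local placed = false
		for _, cluster in clusters do
			-- Join the first cluster whose members are all within RADIUS.
			local near = true
			for _, other in cluster do
				if (pos - other).Magnitude > RADIUS then
					near = false
					break
				end
			end
			if near then
				table.insert(cluster, pos)
				placed = true
				break
			end
		end
		if not placed then
			table.insert(clusters, {pos})
		end
	end
	return clusters
end

local function densestCentroid(positions: {Vector3}): Vector3?
	-- Pick the cluster with the most members and return its mean position.
	local best: {Vector3}? = nil
	for _, cluster in clusterPositions(positions) do
		if best == nil or #cluster > #best then
			best = cluster
		end
	end
	if best == nil then
		return nil
	end
	local sum = Vector3.zero
	for _, pos in best do
		sum += pos
	end
	return sum / #best
end

-- Usage sketch: point a part named "Watcher" (placeholder name) at the crowd.
local watcher = workspace:WaitForChild("Watcher")
local positions = {}
for _, player in game:GetService("Players"):GetPlayers() do
	local root = player.Character and player.Character:FindFirstChild("HumanoidRootPart")
	if root then
		table.insert(positions, root.Position)
	end
end
local target = densestCentroid(positions)
if target then
	watcher.CFrame = CFrame.lookAt(watcher.Position, target)
end
```

Note that this greedy pass is O(n²) and sensitive to the order positions are visited, which is fine for a handful of players. If you need more accuracy at larger scales, look into proper clustering algorithms (DBSCAN, k-means), or into the geometric median, which is the point minimizing total deviation and is far less sensitive to outliers than the plain mean.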
