
Decision Functions
A decision function is a tool in social network analysis that helps compute the decision matrix. Using the decision matrix, we select the two opposing characters in a story (e.g. the protagonist and the antagonist) as anchors, and then we can find the decision vector of each node in the graph (each character in the story).

Definition
"The decision function takes a party member function pm together with a social network which represents the logic of the narrative, and generates a decision matrix. This generates a decision vector for each node. The decision vector provides a “grade” for each option available to that node. The “grade” is given according to the total order $\succeq$ with a direction."

With decision functions we would like to find the decision matrix, but there can be different ways to find it. For example, in Gridland the decision matrix is created using the distances from each node to an anchor.

In a graph the case is similar: we use the distances between vertices to build a distance matrix, and use that to find the decision matrix. In a weighted graph we have a better understanding of the distances between vertices than in an unweighted graph, where all adjacent vertices are the same distance apart. (In an AB graph, for example, it is safe to assume that two characters who talk a lot, and therefore share an edge with a larger weight, are closer than characters who barely speak.) "Formally: Let n be the number of nodes in the social network G. Let k be the number of parties. Let $\mathcal{A}\subset\mathsf{V}$ be the set of known party leaders or anchors. The decision function is: $\psi: \mathbb{L}_P^A \times \mathbb{M}_{n,n}[\mathbb{R}]\rightarrow \mathbb{M}_{n,k}[\mathbb{R}]$"
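The distance matrix of a weighted graph can be sketched as follows. This is a minimal illustration, not from the source: edge weights count interactions, so a heavier edge means a *shorter* distance, modeled here (one possible convention) as length = 1/weight before running Dijkstra from every node.

```python
import heapq

def distance_matrix(nodes, weighted_edges):
    """All-pairs shortest-path distances, where heavier edges mean closer nodes.

    nodes: list of node names; weighted_edges: {(u, v): weight}, undirected.
    """
    adj = {u: [] for u in nodes}
    for (u, v), w in weighted_edges.items():
        length = 1.0 / w  # more interaction -> smaller distance
        adj[u].append((v, length))
        adj[v].append((u, length))
    rows = []
    for src in nodes:
        # Dijkstra from src
        d = {u: float("inf") for u in nodes}
        d[src] = 0.0
        pq = [(0.0, src)]
        while pq:
            du, u = heapq.heappop(pq)
            if du > d[u]:
                continue
            for v, length in adj[u]:
                if du + length < d[v]:
                    d[v] = du + length
                    heapq.heappush(pq, (d[v], v))
        rows.append([d[v] for v in nodes])
    return rows

# Two characters who talk a lot (weight 4) end up closer than a pair
# who barely speak (weight 1):
D = distance_matrix(["A", "B", "C"], {("A", "B"): 4, ("B", "C"): 1})
print(D[0])  # [0.0, 0.25, 1.25]
```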

Anchors
Anchors are characters in a narrative whose opinions we already know, so we can deduce the other characters' opinions from their relation to the anchors. "A node $v_i \in \mathsf{V}$ is an anchor if we know $\Psi_i$ before we compute the decision matrix $\Psi$. We denote the set of all anchors by $\mathcal{A} \subset \mathsf{V}$. Anchors will be used to compute the decision matrix $\Psi$."

Decision Matrix and Vectors
"The matrix $\psi\in \mathbb{M}_{n,k}[\mathbb{R}]$ is called a decision matrix when the rows are the decision vectors which “grade” or “rate” the various options available to the node according to the total order $\succeq$."

Decision vectors are each node's rating of its relation to each anchor. For example, if there are k anchors, the decision vector of a node is 1×k: column 1 holds its rating in relation to anchor 1, column 2 its rating in relation to anchor 2, and so on.

The decision matrix is the combination of all these vectors: row 1 holds the decision vector of node 1, row 2 holds the decision vector of node 2, and so on.
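Under the distance-based approach described above, assembling the decision matrix amounts to keeping, for each node, its distances to the anchors. A minimal sketch (function names are illustrative):

```python
def decision_matrix(dist, anchors):
    """Build an n x k decision matrix from an n x n distance matrix.

    dist: n x n matrix of pairwise distances.
    anchors: list of k anchor node indices.
    Row i is node i's decision vector; column j corresponds to anchor j.
    """
    return [[dist[i][a] for a in anchors] for i in range(len(dist))]

# A path graph on 3 nodes, with the two endpoints as anchors:
path_dist = [[0, 1, 2],
             [1, 0, 1],
             [2, 1, 0]]
print(decision_matrix(path_dist, [0, 2]))  # [[0, 2], [1, 1], [2, 0]]
```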

Example- Gridland
Let's look at Gridland: a 3-by-3 world with 9 nodes separated by a certain distance. Its inhabitants hold an election to decide where to place a nuclear plant. There are two parties: the blue party, which wants the plant near node 1, and the red party, which wants the plant near node 9. Each node in Gridland can vote, which means there are 9 voters.

We will find that the decision matrix is defined according to the distances between the nodes.

$$\begin{pmatrix} 0&1&2&1&2&3&2&3&4\\ 1&0&1&2&1&2&3&2&3\\ 2&1&0&3&2&1&4&3&2\\1&2&3&0&1&2&1&2&3\\2&1&2&1&0&1&2&1&2\\3&2&1&2&1&0&3&2&1\\2&3&4&1&2&3&0&1&2\\3&2&3&2&1&2&1&0&1\\4&3&2&3&2&1&2&1&0 \end{pmatrix}$$

In this example each node's distance from itself is 0, and its distance from each non-diagonal neighboring node is 1.
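Since diagonal steps are not allowed, these distances are Manhattan distances on the grid, and the matrix above can be reconstructed in a few lines (the layout assumption, that node i sits at grid cell ((i-1)//3, (i-1)%3), is ours):

```python
def gridland_distance_matrix(side=3):
    """Pairwise Manhattan distances between cells of a side x side grid,
    with nodes numbered row by row."""
    coords = [(i // side, i % side) for i in range(side * side)]
    return [[abs(r1 - r2) + abs(c1 - c2) for (r2, c2) in coords]
            for (r1, c1) in coords]

D = gridland_distance_matrix()
print(D[0])  # distances from node 1: [0, 1, 2, 1, 2, 3, 2, 3, 4]
print(D[8])  # distances from node 9: [4, 3, 2, 3, 2, 1, 2, 1, 0]
```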

Using this distance matrix, we can find the decision matrix. Each node wants the nuclear plant to be as far away from itself as possible, so naturally node 1 would be on the red team and node 9 would be on the blue team. They are the anchors in this example, as their opinions are the most obvious.

The distance matrix above, with the two anchors as defined, yields this decision matrix:

$$\begin{pmatrix} 0 & 4 \\ 1 & 3 \\ 2 & 2 \\ 1 & 3 \\ 2 & 2 \\ 3 & 1 \\ 2 & 2 \\ 3 & 1 \\ 4 & 0\end{pmatrix}$$

The first column shows how much a node rates having the nuclear plant at node 1, and the second column shows how much it rates having the plant at node 9. Each row represents a node: the first row is node 1, the second row is node 2, and so on.
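Putting the two steps together, the Gridland decision matrix is just the distance-matrix columns of the two anchors, node 1 and node 9. A self-contained check (same illustrative grid layout as before):

```python
def gridland_decision_matrix():
    """Decision matrix for Gridland with anchors at nodes 1 and 9."""
    coords = [(i // 3, i % 3) for i in range(9)]
    dist = [[abs(r1 - r2) + abs(c1 - c2) for (r2, c2) in coords]
            for (r1, c1) in coords]
    anchors = [0, 8]  # 0-based indices of nodes 1 and 9
    return [[dist[i][a] for a in anchors] for i in range(9)]

print(gridland_decision_matrix()[:3])  # [[0, 4], [1, 3], [2, 2]]
```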

As expected, node 1 rates the option of having the nuclear plant at node 1 as 0 but rates the option of having it at node 9 as 4. Because the example uses a symmetric graph, the decision matrix is symmetric as well. Nodes that rate both options as 2 are equidistant from the two anchors (nodes 1 and 9), and so they mind both options equally.