**Semantic Vectors for Recommenders**

Start with a User->Item preference matrix. The values range from -0.5 to 0.5 and are interpreted linearly: 0 is neutral, and -0.5 is twice as negative as -0.25. Blank values default to 0.

|       | Item1 | Item2 | Item3 |
| ----- | ----- | ----- | ----- |
| User1 | 0.3   | 0.6   | 0.9   |
| User2 |       | 0.1   | 0.4   |
| User3 |       |       | 0.7   |

Now, let's do a semantic vectors projection. The User vector is random:

|       | Random |
| ----- | ------ |
| User1 | 0.2    |
| User2 | 0.6    |
| User3 | 0.9    |

The formula for the Item outputs is:

Item(i) = ((sum(U) + sum(pref(u,i))/#U)/2)/#U

Where U is the set of user-vector values for the users expressing a preference for Item(i), and #U is the number of such users.

| Item | Formula |
| ---- | ------- |
| I1   | Not enough preferences |
| I2   | (((U1 + U2) + (pref(u1,i2) + pref(u2,i2))/#U)/2)/#U |
| I3   | (((U1 + U2 + U3) + (pref(u1,i3) + pref(u2,i3) + pref(u3,i3))/#U)/2)/#U |

| Item | Calculation |
| ---- | ----------- |
| I1   | No vector |
| I2   | (((0.2 + 0.6) + (0.6 + 0.1)/2)/2)/2 |
| I3   | (((0.2 + 0.6 + 0.9) + (0.9 + 0.4 + 0.7)/3)/2)/3 |

| Item | Vector |
| ---- | ------ |
| I1   | No vector |
| I2   | 0.2875 |
| I3   | 0.3944… |
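The calculation above can be sketched in a few lines of Python (a minimal illustration of the formula, not code from the original article; the dictionary names are mine):

```python
# Preference matrix: prefs[user][item]; missing entries mean "no preference".
prefs = {
    "User1": {"Item1": 0.3, "Item2": 0.6, "Item3": 0.9},
    "User2": {"Item2": 0.1, "Item3": 0.4},
    "User3": {"Item3": 0.7},
}

# Random projection value for each user.
user_vec = {"User1": 0.2, "User2": 0.6, "User3": 0.9}

def item_vector(item, min_prefs=2):
    """Item(i) = ((sum(U) + sum(pref(u,i))/#U)/2)/#U."""
    users = [u for u in prefs if item in prefs[u]]
    n = len(users)
    if n < min_prefs:
        return None  # not enough preferences to project
    sum_u = sum(user_vec[u] for u in users)      # sum(U)
    sum_p = sum(prefs[u][item] for u in users)   # sum(pref(u,i))
    return ((sum_u + sum_p / n) / 2) / n

for item in ("Item1", "Item2", "Item3"):
    print(item, item_vector(item))
```

Running this reproduces the table above: Item1 gets no vector, Item2 comes out to 0.2875, and Item3 to 0.3944….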

The resulting semantic vectors projection:

|            | Item1 | Item2  | Item3   | User Vector |
| ---------- | ----- | ------ | ------- | ----------- |
| User1      | 0.3   | 0.6    | 0.9     | 0.2         |
| User2      |       | 0.1    | 0.4     | 0.6         |
| User3      |       |        | 0.7     | 0.9         |
| ItemVector |       | 0.2875 | 0.3944… |             |

*(Figure: a very difficult-to-read graph of the relationships.)*

**Recommendations**

The recommendations for each user are ranked by the distance between the user's vector and each item vector:

- User1 would be most interested in Item3, then Item2.
- User2 has interests in the same order, but would find the whole list less interesting.
- User3 would be most interested in Item2, then Item3.
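The distance computation behind these rankings can be sketched as follows (a minimal illustration, not the article's original code; the item vectors are the ones derived above, and Item1 is skipped because it has no projection):

```python
# Each user's affinity for an item is read off the distance between the
# user's random vector and the item's projected vector.
user_vec = {"User1": 0.2, "User2": 0.6, "User3": 0.9}
item_vec = {"Item2": 0.2875, "Item3": 0.3944}  # Item1 has no vector

distances = {
    user: {item: abs(uv - iv) for item, iv in item_vec.items()}
    for user, uv in user_vec.items()
}
for user, d in distances.items():
    print(user, d)
```

For example, User1's distance to Item2 is |0.2 - 0.2875| = 0.0875 and to Item3 is |0.2 - 0.3944| = 0.1944.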

**Item-Item Similarity**

The Item-Item similarity set is the distances between the item vectors. Unfortunately, since Item1 has only one preference, it has no vector projection.

Item2:Item3 = 0.11
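That single pairwise value is just the distance between the two item vectors (a sketch, using the vectors computed earlier; the exact value is 0.1069, which rounds to the 0.11 quoted above):

```python
# Item-Item similarity: distance between the two projected item vectors.
item2_vec, item3_vec = 0.2875, 0.3944
similarity = abs(item2_vec - item3_vec)  # 0.1069, rounds to 0.11
print(round(similarity, 2))
```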

**User Vectors**

The User-User distances are random, so there is nothing to be learned from them.

**Summary**

The Semantic Vectors algorithm takes a matrix of Row->Column relationships and creates a set of Row-Column distances and a set of Column-Column relationships in a new, common numerical space. In this example, we created two recommenders: a User->Item recommender and an Item->Item recommender.