Limits of Branded Social Networks – Blogs

While our product is far more than a branded social network, we realize one key difference: on most social networks, the company providing the platform is merely a conduit, indifferent to what its users say and do, whereas our network will be paid for by political campaigns and organizations that may want to control what goes on in their web space. The problem is that if an organization begins to noticeably intrude on its social network – censoring posts, comments, and images beyond enforcing decency standards – its users will flee and make their voices heard on unrestricted channels.

Looking at major campaign web sites this election cycle, I don’t see much authenticity. I’d love to see more large campaign blogs actually written by the candidate, featuring posts that read like more than just a press release. I have little reason to believe that campaigns will expect anything different from the users of their social networks – conversations with nothing but praise for the candidate and their policies, messages, and advertising.

In an ideal world, branded social networks will facilitate open and honest conversations with individuals – people with independent opinions who generally support the group whose network they’ve joined, but may disagree from time to time. Clearly, we need to make it easy to identify and silence people who are only out to harm the brand. We’d also like to make it easy for supporters to voice their honest opinions, but I fear we’ll spend more time building censorship filters for some community features than we spent building the features themselves. Organizations will be better served if they can look within their own network of supporters to understand where their agenda differs from that of their supporters.

Time will tell. I won’t always be able to call out specific examples, but I’ll gladly talk about general errors in this area, praise open conversations, and look forward to building community features that facilitate these discussions.


Netflix Prize: Best Rated and Most Rated Movies

I’m currently intrigued by the Netflix Prize – anyone who can improve Netflix’s score prediction engine by 10% wins $1 million. Although we sliced a lot of data for predictive modeling at Proficient, I don’t expect to have the expertise to win, but I love messing with data, so in the meantime I’m playing with the files of 100 million ratings – 2 GB uncompressed – that Netflix provides for contest participants to train and test their algorithms against.

The first interesting statistics I’ve derived from this dataset are the highest rated and most frequently rated movies.

The best rated – (average score), dataset average 3.6043

  1. Lord of the Rings: The Return of the King: Extended Edition – 4.72327
  2. The Lord of the Rings: The Fellowship of the Ring: Extended Edition – 4.71661
  3. Lord of the Rings: The Two Towers: Extended Edition – 4.70261
  4. Lost: Season 1 – 4.67099
  5. Battlestar Galactica: Season 1 – 4.63881
  6. Fullmetal Alchemist – 4.60502
  7. Trailer Park Boys: Season 4 – 4.6
  8. Trailer Park Boys: Season 3 – 4.6
  9. Tenchi Muyo! Ryo Ohki – 4.59551
  10. The Shawshank Redemption: Special Edition – 4.59338
  11. Veronica Mars: Season 1 – 4.59208
  12. Ghost in the Shell: Stand Alone Complex: 2nd Gig – 4.58636
  13. Arrested Development: Season 2 – 4.58239
  14. The Simpsons: Season 6 – 4.5813
  15. Inu-Yasha – 4.55443
  16. Lord of the Rings: The Return of the King: Extended Edition: Bonus Material – 4.552
  17. Lord of the Rings: The Return of the King – 4.54512
  18. Star Wars: Episode V: The Empire Strikes Back – 4.5437
  19. The Simpsons: Season 5 – 4.54256
  20. Fruits Basket – 4.53891

The most rated – (number of ratings), 100,480,507 total ratings

  1. Miss Congeniality – 232944
  2. Independence Day – 216596
  3. The Patriot – 200832
  4. The Day After Tomorrow – 196397
  5. Pirates of the Caribbean: The Curse of the Black Pearl – 193941
  6. Pretty Woman – 193295
  7. Forrest Gump – 181508
  8. The Green Mile – 181426
  9. Con Air – 178068
  10. Twister – 177556
  11. Sweet Home Alabama – 176539
  12. Pearl Harbor – 173596
  13. Armageddon – 171991
  14. The Rock – 164792
  15. What Women Want – 162597
  16. Bruce Almighty – 160454
  17. Ocean’s Eleven – 160326
  18. The Bourne Identity – 158601
  19. The Italian Job – 156183
  20. I, Robot – 155714
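For anyone who wants to reproduce numbers like these, here’s a minimal Python sketch of how per-movie averages and counts can be derived from the training files. The parsing assumes the published file format (a `MovieID:` header line followed by `UserID,Rating,Date` rows); the sample data and function name here are illustrative, not from the actual dataset.

```python
from collections import defaultdict

def movie_stats(lines):
    """Accumulate (average rating, rating count) per movie from lines in
    the Netflix Prize training format: a 'MovieID:' header followed by
    'UserID,Rating,Date' rows for that movie."""
    totals = defaultdict(lambda: [0, 0])  # movie_id -> [rating sum, count]
    movie_id = None
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.endswith(":"):            # new movie header
            movie_id = int(line[:-1])
        else:                             # a rating row for the current movie
            _user_id, rating, _date = line.split(",")
            totals[movie_id][0] += int(rating)
            totals[movie_id][1] += 1
    return {m: (s / c, c) for m, (s, c) in totals.items()}

# Hypothetical sample in the published file layout
sample = """\
1:
1488844,3,2005-09-06
822109,5,2005-05-13
2:
885013,4,2005-10-19
30878,4,2005-12-26
823519,3,2004-05-03
"""

stats = movie_stats(sample.splitlines())
best = max(stats, key=lambda m: stats[m][0])  # highest average score
most = max(stats, key=lambda m: stats[m][1])  # most ratings
```

In practice you’d stream the real 2 GB of training files through the same accumulator rather than loading them into memory, and you’d likely want to filter out movies with only a handful of ratings before ranking by average.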

If there are other interesting stats you’d like derived, or you have fancy ideas on how to best predict a user’s rating, drop me an email or comment.