My colleague @Neil_H_NYC tweets, "Don, any comments on this TED lecture?"
I hadn't seen Pariser's talk, but it's a great one. His point is that Facebook and Google are among the sites that use algorithms based on your own search and interaction patterns to give you feeds and results that are more like what you want to see. The motive is good, but as he points out, it results in "bubbles," in which your worldview is limited.
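To make the bubble mechanism concrete, here's a minimal, purely illustrative sketch (not any real Facebook or Google algorithm): it ranks new items by how much their words overlap with what the user clicked before, so the feed naturally drifts toward more of the same.

```python
# Illustrative only: a toy "filter bubble" ranker. It scores new items
# by word overlap with items the user clicked before, so the feed
# keeps serving more of what the user already reads.
from collections import Counter

def interest_profile(clicked_items):
    """Build a word-frequency profile from past clicks."""
    profile = Counter()
    for text in clicked_items:
        profile.update(text.lower().split())
    return profile

def rank_feed(candidates, profile):
    """Order candidate items by overlap with the user's profile."""
    def score(text):
        return sum(profile[w] for w in text.lower().split())
    return sorted(candidates, key=score, reverse=True)

clicks = ["local food markets", "organic food recipes"]
feed = ["food festival guide", "city council budget vote"]
print(rank_feed(feed, interest_profile(clicks)))
# → ['food festival guide', 'city council budget vote']
```

Even this toy version shows the feedback loop Pariser describes: the food story wins because of past clicks, which earns it more clicks, which strengthens the profile further.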
Bubbles represent the current state of curation-by-machine: give me more like what I've gotten before. But part of the community concept I've advocated is the idea of human curation. E-commerce analysts already recognize that a recommendation from a trusted friend is among the most powerful influences on online shopping; the idea of news or search from a trusted friend is not really a stretch.
This recent blog post discusses the idea and quotes a Mashable report in which an NYU professor says, "Curation comes up when search stops working."
In the near future, I'll be talking more about using the idea of community to see where social business and social networking are headed. The elevator pitch for that article is this: "technology is giving us the ability to make connected life like the very best parts of life in a village--where you know everyone, your rights and tastes are part of the fabric of everyday life, and every experience is personal and relationship-driven."
Computers are just not smart enough to replicate that--yet. But why do you think Google is Very Heavily Involved in artificial intelligence? One of the huge tech fortunes yet to be made will come from AI-based personal assistants--also the subject of an upcoming post.
In the meantime, filter bubbles reflect our own behavior: you can live in a gated online community as easily as in one created by a suburban developer, where the people you see are all just like you and the ideas you're exposed to agree with everything you already believe. We can choose otherwise, as Yes told us: "don't surround yourself with yourself."