I'm happy to see that my post about the design decisions behind Outlook 2007's RSS reader got a response. To get everyone up to speed: Outlook 2007's RSS reader currently handles duplicate feed items very poorly, flooding RSS "inboxes" with a huge number of duplicates. Future builds of Outlook, including the final version, will filter duplicate items much more strictly: if a feed item is changed but has already been marked "read" or deleted on the user's machine, the new item will be ignored entirely. I thought this was a terrible idea, since bloggers could issue very important corrections to stories, and Outlook would ignore them. Outlook 2007 already has many problems recognizing items as duplicates when a single detail changes. The new system is overkill, while the old one is a major time-waster; neither is a good solution. Having no personal knowledge of how to design good software, I ask this: in the new system, is it possible to configure Outlook to redownload a post if the actual text of the post changes? Ignoring every other attribute of the post, if I delete an item or mark it as read, can Outlook still be set to redownload it if the article text itself changes? I hope Michael can explain whether or not such a thing is possible. Here's how he describes the turning-off of this feature:
If you don't like this delete model where removing a post from Outlook ignores all further updates on that individual item, there is an option to disable it. Toggling this option causes deletes to be non-destructive, and in the above scenario the user would see a new item downloaded for post B1 even though he deleted the original item in Outlook.

I'm not entirely sure what that means. Does Outlook revert to the current, messy model, or to something else entirely? I would love a deeper explanation. And thanks, Michael, for explaining all of this to us. I love Outlook 2007, which I have been using for most of the year, and I want to continue to love it. By the by, here's what T has to say:
This is a good example of RSS aggregator developers attempting to fix a legitimate issue by doing readers an unintentional disservice. They are trying to keep duplicate posts from appearing in the aggregator, which is a good thing, dupes most definitely suck, but their solution forgets about the many types of posts with updates.

He recommends letting users decide, on a feed-by-feed level, how to treat duplicates. I like the idea of having such fine-grained control, but I think it would only make things very complicated. I have hundreds of feeds, and I would hate having to set a bunch of preferences every time I added one. Maybe this could be a good option for power users.

InsideGoogle blog, offering the latest news and insights about Google and search engines.
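For what it's worth, the text-only duplicate check I'm asking for could be sketched roughly like this. This is a minimal sketch, assuming the aggregator keeps a set of fingerprints of article bodies it has already shown the user; every name here (`text_fingerprint`, `should_redownload`, `seen_hashes`) is hypothetical and is not anything from Outlook's actual internals:

```python
# Hypothetical sketch: flag an item for redownload only when its
# article text changes, ignoring every other attribute of the post.
import hashlib

def text_fingerprint(body: str) -> str:
    """Hash only the article text, ignoring title, date, and other metadata."""
    normalized = " ".join(body.split())  # collapse whitespace before hashing
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def should_redownload(item_body: str, seen_hashes: set) -> bool:
    """Return True only when the article text itself has changed."""
    fp = text_fingerprint(item_body)
    if fp in seen_hashes:
        return False  # same text: stay suppressed, whatever else changed
    seen_hashes.add(fp)
    return True
```

The idea is simply that a change to any other attribute (title, category, publish date) leaves the body hash unchanged, so a deleted or read item stays suppressed, while an actual correction to the article text produces a new hash and comes back down.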