I agree with pubcrawler's comments. There needs to be more page-by-page analysis, instead of site-wide penalties. This seems to be at the core of many complaints on this page.
My website contains a mix of reprint articles (a small percentage) and original content. About 15% of the site consists of reprinted work from article-distribution websites. But I gave up that publishing model years ago. Ever since then, I've been publishing 100% original work. These are useful, in-depth tutorials that often run to 1,000 words or more. This original work makes up the vast majority of my website. Yet my entire site has been penalized because of (I assume) a small percentage of reprint articles. There are no tricks on my site -- no black hat nonsense or onsite duplication. Just articles, most of which are original, useful pieces. It seems unreasonable that the entire site would be "downgraded" because of a small portion of the content.
The question many are asking is this: Why can't articles / blog posts / web pages be evaluated on their individual merits? Does a small amount of press release content or other reprinted work condemn you to low rankings across your entire site? And what does all of this mean for the future of web publishing?