Intel Sued Over Core 2 Duo Patent Infringement
An anonymous reader writes "It looks like Intel is being sued over a patent infringement alleged to be in the Core 2 Duo microprocessor design. 'The Wisconsin Alumni Research Foundation (WARF) is charging Intel Corporation with patent infringement of a University of Wisconsin-Madison invention that significantly improves the efficiency and speed of computer processing. The foundation's complaint identifies the Intel Core™ 2 Duo microarchitecture as infringing WARF's United States Patent No. 5,781,752, entitled "Table Based Data Speculation Circuit for Parallel Processing Computer." WARF contacted Intel in 2001, and made repeated attempts, including meeting face-to-face with company representatives, to offer legal licensing opportunities for the technology.' The text of the complaint [PDF] is also available via WARF's site."
Re:Universities Are Good (Sometimes) (Score:3, Interesting)
Just because it's not a troll doesn't mean it's a good patent. It may be that the solution is obvious to one "skilled in the art" even though no one seriously considered the problem before. Just because the university thought of it first doesn't mean it's a good patent.
Of course, I haven't looked at the details of the patent or the case. It may well be a blatant attempt by Intel to rip off a clever idea from the university. My guess is that reality is somewhere in between...
Re:Universities Are Good (Sometimes) (Score:3, Interesting)
As someone who once was a collegiate instructor and employee, I can say for certain that no self-respecting board-of-regents-run college would even think of lowering tuition, for any reason. Scholarships, sure... as long as the money comes out of somebody else's wallet. Student financial aid? Again, they love it - but on the same premise as scholarships. Work-study programs? Okay, but it's the equivalent of getting offshore-priced labor on their part.
No, my friend... no way in Hell you'd ever see a Uni drop tuition pricing in response to a flush of patent-money. They'd rather spend the dough on a new football stadium, or on perks for the tenured and administration.
The one and only condition that would see a tuition price drop is in response to a drop in head-count, or in response to any new competition (e.g. University of Phoenix opening a new campus in the same town or city...)
Re:Universities Are Good (Sometimes) (Score:5, Interesting)
Re:Now that you mention it... (Score:4, Interesting)
Re:Universities Are Good (Sometimes) (Score:3, Interesting)
Re:Universities Are Good (Sometimes) (Score:3, Interesting)
Well, yeah, the way this is normally discussed, the conversation usually turns towards someone's irrational communist leanings about how "some people" should pay for "everybody else" to go to college. Never mind that not everyone ought to go to college at all, much less because others were forced at gunpoint to pay for it.
But that's not what I'm replying to you for...
This is actually brilliant, and is quite the opposite of communism. In fact, this may very well be the answer to the problem you're talking about in the first part of your post.
Universities currently get a fair bit of financing from private donors, from athletics, and from taxes. So much of tuition is subsidized for one reason or another that lots of people go that perhaps ought not to. There's a demand glut, so to speak, and so there is little incentive for a university to do anything other than expand and raise prices.
A very nice thing about what you suggest is that that investment revenue can be re-invested by the university, for the university, to fund scholarships for promising students. Top flight schools like MIT effectively have this arrangement -- if you get into MIT (because of your qualifications), you WILL be able to afford it, because a variety of interested university backers will see that the money appears.
Generalizing this a bit, a university with disposable income from the past results of its research may have an incentive to recruit promising new talent and thus subsidize the education of the best and brightest minds.
And all of this would be done without coercion by the state. Different universities would have entirely different criteria for who they believe is promising, and how they think their scholarship money ought to be spent.
This is part of a solution for how people can go to college without making society at large pay for just anybody to go for any reason. Investors will choose which universities to invest in, based on selection criteria, past performance, etc., and that will tend to cause universities to spend their money a bit more wisely.
There are a variety of other privatized education funding models discussed by Friedman, etc, but this one is one I've not thought about before. Thanks for mentioning it.
Re:Now that you mention it... (Score:3, Interesting)
Second, in this case, WARF actually contributes significantly to both the University of Wisconsin-Madison (a state school) and to the inventors, the inventing laboratory, and the inventing department. You can read about the process here: http://warf.ws/inventors/index.jsp?cid=14&scid=40 [warf.ws]
Third, WARF has been at this for a very long time. They're a very sophisticated patenting and licensing entity. They have definitely thought this through. According to their website, they are also not in the business of patenting every idea that every professor discloses to them (they say 60% of disclosures, but who knows).
Furthermore, they're a great asset to those inventors without the means to pursue licensing and patent protection on their own. Inventors pay nothing up front for what is otherwise a very expensive, time consuming ordeal.
Re:I happen to work in WARF (Score:3, Interesting)
Re:Not a Troll then? (Score:3, Interesting)
And let's not forget it's extremely expensive to file lawsuits. If anything it's in the best interest of the patent holder to come to some agreement rather than go to court. Sure they'll get their court costs back if they win, but who can afford to drop tens of thousands of dollars over the course of a few years?
Trust me, unless they're incredibly wealthy no one wants to go to court, especially against someone with $$$$. I've filed lawsuits, and even when what they did is obviously illegal and I'll easily win in court, I still have to decide whether it's worth paying thousands of dollars in attorney fees and court costs over the course of a few months just to get several thousand back in the end. They really need to up small claims to $10,000+, because everything seems to be over three grand nowadays.
Re:Happened to Sony and IBM also (Score:3, Interesting)
From http://www.warf.org/inventors/index.jsp?cid=7 [warf.org]
After deducting this portion, a certain percentage goes to their operating costs - I'm sure keeping a number of patent lawyers around isn't cheap. The good thing is that 20% is BEFORE those costs. The rest goes into a grant given to the university, distributed as such.
Whether or not shady accounting occurs in these settlements and grants I have no idea, but I have no reason to believe so. As I said before, everyone I know who holds a patent through WARF has been quite happy with the arrangement.
At UW, if no federal funding was involved, the intellectual property generated from research is the researcher's (unless there were strings attached to the private funding) - they don't HAVE to deal with WARF at all. I know this for a fact from my dealings with them - my colleagues and I chose to work with them because the benefits of going through WARF FAR outweighed the cons.
Sure, most settlements are confidential to the public, but not to the patent holder (who would include the inventor/researcher along with WARF).
From your post:
I appreciate your skepticism, and can understand where you're coming from except for this comment - you have no clue what I do or do not know - don't pretend you do for dramatic effect. I did my research on WARF when my personal interests were on the line, and from what I was able to discern it's a GOOD deal for the researchers/inventors, the University, and the student body.
Re:Not a Troll then? (Score:3, Interesting)
The memory disambiguation table is a variant on a branch predictor (I'm not going to give the exact Intel algorithm). It's obvious. The only reason no one has done it before is that the benefits didn't outweigh the implementation (and especially, validation) costs. Core 2 is a big enough machine that it's worthwhile.
Here is how I understand what Intel does, from the publicly available description: the processor often encounters load instructions where the memory in question may or may not have been modified by a previous store instruction. (The "may or may not" case arises when a previous store instruction has not finished calculating its address yet; the case where the processor _knows_ the data has been modified is something entirely different.) This situation happens quite often, and quite often the store instruction did _not_ modify the data at the load address.
In this situation, the processor cannot simply execute the instruction. It can use out-of-order execution, delaying the load, but out-of-order execution has its hands full with _real_ dependencies, and it would help if it didn't have to bother with possible dependencies that never actually materialize. Therefore Intel lets the processor continue speculatively.
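The tradeoff described above can be illustrated with a toy model (my own sketch, not Intel's mechanism; the function and its return values are made up for illustration): a load stuck behind a store whose address is still unknown can either wait, or execute speculatively and replay only if the addresses turn out to collide.

```python
# Toy model of the store/load hazard discussed above (illustrative only).
# A load behind a store with an unresolved address either waits
# (conservative policy) or speculates and replays on an actual conflict.

def schedule_load(store_addr_known, store_addr, load_addr, speculate):
    """Return (action, replayed) for one load behind one pending store."""
    if not store_addr_known:
        if not speculate:
            return ("waited", False)  # conservative: stall the load
        # Speculate: execute now; when the store address resolves,
        # replay only if the addresses actually collided.
        return ("executed", store_addr == load_addr)
    # Address already known: a real dependency forces store-to-load forwarding.
    if store_addr == load_addr:
        return ("forwarded", False)
    return ("executed", False)
```

Since the store usually does not alias the load, the speculative policy avoids the stall in the common case and pays a replay penalty only rarely.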
The text of the patent covers a lot about how to recover from incorrect speculative execution, but that is no problem at all: all the hardware for it is available already because of branch prediction. With branch prediction, the processor has to keep track of speculatively executed instructions, undo them if necessary, and continue execution at a different program counter (for a branch through a jump table whose destination was predicted incorrectly, restart at the jump instruction; for a conditional branch incorrectly predicted taken, continue after the branch instruction; in this case, continue with the load instruction). So half of the patent isn't infringed upon at all.
The patent also seems to suggest that one should look at the store instructions to decide whether to speculatively execute the load or to wait. It doesn't look to me like Intel is doing this. It seems that load instructions that are ready to execute except for store/load hazards are classified as: safe to execute, known not safe, or unknown. "Safe to execute" executes; "known not safe" doesn't. "Unknown" can use mostly the same mechanism as branch prediction. I would probably try to actually use the same hardware as for branch prediction. That would trivially allow multi-level approaches (for example, loading a[i] can execute if a previous branch around the statement "if (...) a[i] = x" was taken, skipping the store, and mustn't execute if the branch was not taken). Branch prediction does that kind of thing already, so that would come for free.
The only thing that has to be changed is the mechanism that marks a prediction as correct or false. Usually that is a compare instruction setting some condition codes, here it is an instruction calculating the address of a store instruction.
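A minimal sketch of the table-based scheme described above (my own reconstruction, not the patented circuit or Intel's design; the table size and class names are invented): a small table of 2-bit saturating counters, indexed by the load's PC, predicts whether the load is safe to hoist past unresolved stores, and the counter is trained at exactly the point described above - when the store's address resolves.

```python
# Sketch of a table-based load-speculation predictor (illustrative only):
# 2-bit saturating counters indexed by load PC, trained when the earlier
# store's address resolves, analogous to updating a branch predictor.

TABLE_SIZE = 256  # arbitrary size for illustration

class LoadSpeculationTable:
    def __init__(self):
        # Start "strongly safe": speculate until a conflict is observed.
        self.counters = [3] * TABLE_SIZE

    def _index(self, load_pc):
        return load_pc % TABLE_SIZE

    def predict_safe(self, load_pc):
        # Counter >= 2 means "predict no store/load conflict": speculate.
        return self.counters[self._index(load_pc)] >= 2

    def update(self, load_pc, conflicted):
        # Called once the earlier store's address is known, the analogue of
        # a compare instruction resolving a branch prediction.
        i = self._index(load_pc)
        if conflicted:
            self.counters[i] = max(self.counters[i] - 1, 0)
        else:
            self.counters[i] = min(self.counters[i] + 1, 3)
```

The hysteresis of the 2-bit counter means a single spurious conflict does not permanently disable speculation for that load, the same design choice 2-bit branch predictors make.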
Now I cannot decide if what they have patented is obvious or not. However, the mechanism that I have described here _is_ obvious as shown by the fact that I, a programmer and not a hardware designer, and therefore not an expert in the field, can describe it. If what Intel does is what I have described then what they do is obvious.
I'll mention a different invention that is just a tiny invention, obvious _after_ you read it and patented (but not obvious _before_ you read it): Branch prediction predicts whether a branch is taken or not. It uses tables, and tables run out of space. When that happens, a branch will use a prediction that was actually intended for a different branch instruction - with not very good results. To improve this, you do a (trivial) static prediction. Then you don't store in your tables whether the branch is taken or not, you store whether it matches the static prediction or not. Quite often (80%
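The "agree" trick just described can be sketched as follows (my own reconstruction for illustration; the static rule and class name are made up): the table stores whether a branch agreed with a trivial static prediction, so two branches that alias to the same entry interfere less, because most branches agree with their static hint and therefore store the same value.

```python
# Sketch of an "agree"-style branch predictor (illustrative only): the
# table records agreement with a static prediction rather than the raw
# taken/not-taken outcome, reducing damage from table aliasing.

def static_prediction(branch_pc, target_pc):
    # Trivial static rule: backward branches (loops) are predicted taken.
    return target_pc < branch_pc

class AgreePredictor:
    def __init__(self, size=64):
        # Most branches agree with their static hint, so initialize to True.
        self.agrees = [True] * size

    def predict(self, branch_pc, target_pc):
        agree = self.agrees[branch_pc % len(self.agrees)]
        static = static_prediction(branch_pc, target_pc)
        return static if agree else not static

    def update(self, branch_pc, target_pc, taken):
        # Store agreement with the static hint, not the outcome itself.
        self.agrees[branch_pc % len(self.agrees)] = (
            taken == static_prediction(branch_pc, target_pc))
```

When two branches collide in the table, a conventional predictor hands one branch the other's outcome; here it hands over an "agree" bit that is usually correct for both.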