Today a lot of notebooks have two GPUs: one for low-performance jobs and one for the tasks that require more performance. There are three possible cases, and as far as I am aware, in two of the three cases silicon, money and performance are wasted...

CASE 1: AMD APU + dedicated ATI card

With this setup there aren't many problems. When doing low-performance tasks the ATI is off, the APU's GPU does the GPU work, the APU's CPU cores do the CPU work, and everyone is happy. When more graphical performance is required the ATI turns on, and because the ATI and the APU's GPU can work together in CrossFire, the maximum graphical resources are only slightly less than the APU GPU + ATI combined. So there are two load situations:

Pretty idle: the max available CPU resources are the max the APU can deliver, and the max available GPU resources are the max the APU can deliver.

Pretty heavy load: the max available CPU resources are still the max the APU can deliver, but the max GPU resources are about the max that the APU + the ATI can deliver together.

So in the full-load case all the silicon is working and nothing is wasted. One could argue that AMD could have used the silicon and die area of the APU's GPU to make more x86 cores, but that would reduce GPU performance at the same time. The conclusion is that the engineers at AMD have done their work pretty well, and everything is balanced out quite nicely.

CASE 2: Intel processor graphics + any dedicated graphics card

This is where they start to mess things up. Again there are two load situations:

Pretty idle: the same situation as the idle case of the AMD setup: the dedicated card is off and the Core i GPU does the work. So the max available CPU resources are the max the CPU can deliver, and the max available GPU resources are the max the Core i GPU can deliver.

Pretty heavy load: this is where it goes wrong, in my opinion.
As far as I know the Core i GPUs don't support CrossFire, and they don't support SLI either. So the only thing that can happen when your beloved NVIDIA or ATI beast starts its engines is that the Core i GPU is turned off. Now the max available GPU resources are the max the NVIDIA or ATI card can deliver, and the max CPU resources are the max the CPU can deliver. Note that in this situation the Core i GPU is doing nothing at all; at that moment it is just a piece of wasted silicon and wasted money (you paid for it).

Since the GPU resources will never be higher than the max of the dedicated card, couldn't Intel remove the GPU from the Core i and add more x86 cores instead? The difference with the AMD setup is that here it wouldn't reduce graphics power. So in every load situation the max available CPU resources would be higher, since there are more cores, and the max available GPU resources would stay the same. The only thing required here is a graphics card that underclocks itself when there isn't much to do, not just a little bit but severely:

- Turn off part of the memory.
- Underclock the memory that isn't turned off.
- Underclock the GPU core.
- If possible, turn off individual streaming processors, CUDA cores, shader units, etc.

This deep underclock would be the answer to the increased energy usage of a main graphics card that is always on. The CPU cores should also be underclocked or turned off, but if I'm not mistaken Intel already turns off idle cores.

CASE 3: AMD APU + dedicated NVIDIA card

Because the AMD APU doesn't support SLI and the NVIDIA card doesn't support CrossFire, I think this is the same situation as with the Intel GPUs. But in my opinion it's silly to combine an NVIDIA card and an AMD APU in one system; one could better use an ATI card in these machines.

So the big question: why do OEMs build systems with Intel GPUs + a dedicated card? Of course NVIDIA and AMD would need to add advanced underclocking features to their cards first, but I don't think that's too hard.
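The "max available resources" argument for CASE 1 versus CASE 2 can be put into a toy model. All the performance numbers here are made up purely for illustration; the point is only the structure of the argument, and real CrossFire scaling is imperfect, so the simple sum below is optimistic.

```python
# Hypothetical relative performance figures -- illustration only.
APU_GPU = 1.0      # integrated GPU on the AMD APU
INTEL_GPU = 1.0    # Intel processor graphics
DEDICATED = 4.0    # discrete ATI/NVIDIA card

def max_gpu(igpu, dgpu, can_combine, heavy_load):
    """Max GPU resources available in a given load situation."""
    if not heavy_load:
        return igpu           # discrete card is powered off
    if can_combine:
        return igpu + dgpu    # CASE 1: CrossFire, both chips work
                              # (real scaling is less than the full sum)
    return dgpu               # CASE 2/3: the integrated GPU just turns off

# CASE 1: AMD APU + ATI card -- under load, nothing sits idle
case1 = max_gpu(APU_GPU, DEDICATED, can_combine=True, heavy_load=True)

# CASE 2: Intel graphics + discrete card -- the iGPU silicon is wasted
case2 = max_gpu(INTEL_GPU, DEDICATED, can_combine=False, heavy_load=True)

print(case1)  # 5.0
print(case2)  # 4.0 -- the 1.0 of integrated silicon contributes nothing
```

The gap between the two results is exactly the wasted integrated GPU that, in the argument above, could have been spent on more x86 cores instead.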
Intel, for its part, would have to make mobile processors without a GPU. I think processors with more x86 cores instead of a GPU would actually be cheaper to develop and manufacture, because they're less complex.
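The aggressive underclock sequence listed above could be sketched roughly as follows. Everything here is hypothetical: real drivers expose none of these names or numbers; the sketch only illustrates the ordering of the four power-down steps.

```python
from dataclasses import dataclass

@dataclass
class GpuState:
    """Made-up GPU power state -- values chosen for illustration only."""
    mem_banks_on: int = 8        # powered memory banks
    mem_clock_mhz: int = 1000
    core_clock_mhz: int = 800
    shader_units_on: int = 96    # streaming processors / CUDA cores / shader units

def deep_idle(gpu: GpuState) -> GpuState:
    """Apply the severe underclock described in the text, step by step."""
    gpu.mem_banks_on = 2         # 1. turn off part of the memory
    gpu.mem_clock_mhz = 150      # 2. underclock the memory that stays on
    gpu.core_clock_mhz = 100     # 3. underclock the GPU core
    gpu.shader_units_on = 8      # 4. gate individual shader units, if possible
    return gpu

idle = deep_idle(GpuState())
print(idle)
```

With a state like this, an always-on discrete card could approach the idle power draw that currently justifies keeping a second, integrated GPU around.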