Intel's new CPU/GPU vs others

Discussion in 'PC Hardware' started by Oery, Mar 8, 2011.

  1. Oery

    Oery MDL Junior Member

    Nov 14, 2010
    71
    15
    0
    My apologies if this post is a bit off topic......... but this forum is full of many "qualified" enthusiasts --- so......
    :)

    My question is simple:
    almost all of Intel's recent processors have an integrated graphics processor (GPU),
    so if we use another GPU, such as an ATI/NVIDIA card, the Intel GPU just goes to waste, doesn't it?
    Does it even consume power for nothing?

    Thank you.
     
  2. alextheg

    alextheg MDL Expert

    Jan 7, 2009
    1,776
    812
    60
    If you are using an add-in GPU like an NVIDIA or ATI card, then the onboard graphics will be disabled and consume nothing.
     
  3. burfadel

    burfadel MDL EXE>MSP/CAB

    Aug 19, 2009
    2,627
    3,856
    90
    It is a bit of a waste, considering you are paying for something you can't use. Putting it on the mainstream CPUs, and on the CPUs aimed at medium to high-end PC users, is fairly pointless, as those users almost all have separate GPUs. The chip is fine for those who word-process and so on, but you probably don't need a medium/high-end CPU for that!

    Therefore it makes sense on the i3, but not on the i5/i7 and up. If the integrated GPU could supplement the add-on GPU, that would be a different story altogether!
     
  4. Oery

    Oery MDL Junior Member

    Nov 14, 2010
    71
    15
    0
    Same thought :)
    Giant Intel pushes everyone to buy a "CRAP" GPU that is outperformed by old ATI/NVIDIA cards!
    The latest i5/i7 xxxxx-series (32 nm lithography) all have an integrated GPU, even the K parts..... :(
     
  5. frwil

    frwil MDL Addicted

    Sep 22, 2008
    541
    195
    30
    Well, there are a few ways to make the new reality easier to accept:
    1. Consider the integrated GPU to be thrown in for free... after all, who says that if they had released new-gen CPUs without integrated graphics, they would have been any cheaper?
    2. I don't really follow what's coming next, but aren't the high-end processors of this generation only due at the end of this year, and without integrated graphics?
    3. Not really important, but this integrated GPU isn't actually that bad. It depends on what you compare it with, of course. If you don't really need DirectX 11 support or a high-end gaming experience, it will seem fine (a quick way to check what feature level your hardware reports is sketched below). Intel has made big progress here compared to the graphics in the first generation of i7/i5/i3, and it's no longer fair to say it's only useful for office work. Consider it just another step in the march of integration; eventually separate graphics cards will be the exotic option... I mean, nobody minds that all mobos come with integrated sound chips anymore? We just got used to it.
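
    A minimal sketch of that feature-level check, assuming a Windows SDK with the Direct3D 11 headers is installed; the level reported obviously depends on the GPU and driver actually present:

    // Query the default hardware adapter for the highest Direct3D feature
    // level it supports, without keeping a device around.
    #include <d3d11.h>
    #include <iostream>
    #pragma comment(lib, "d3d11.lib")

    int main()
    {
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3
        };
        D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;

        // Null device/context pointers turn this into a pure capability query.
        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                       wanted, sizeof(wanted) / sizeof(wanted[0]),
                                       D3D11_SDK_VERSION, nullptr, &got, nullptr);
        if (SUCCEEDED(hr))
            std::cout << "Highest supported feature level: 0x"
                      << std::hex << got << std::endl;
        return SUCCEEDED(hr) ? 0 : 1;
    }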
     
  6. Oery

    Oery MDL Junior Member

    Nov 14, 2010
    71
    15
    0
    #6 Oery, Mar 8, 2011
    Last edited: Mar 8, 2011
    (OP)
    Look.... Intel releases the P67 chipset for enthusiasts, which doesn't even enable the internal GPU, yet every CPU that fits it still has a GPU inside...... that is ridiculous.
    I just hope ATI/NVIDIA will work something out, maybe some kind of cross-platform SLI/CrossFire that can use the Intel GPU as a supporting processor.....
    A motherboard's sound chip can be used side by side with a sound card, if we want it.
    And, believe me, nothing comes free from Intel :)
     
  7. alextheg

    alextheg MDL Expert

    Jan 7, 2009
    1,776
    812
    60
    On the whole I think Intel are doing something good. They are supplying chipsets that do everything...... pretty much straight out of the box. Of course some of us techies or enthusiasts have special requirements, especially high-end gamers.

    It all comes down to budget: you've just spent £800-1,000 on an i5 or i7 system. Do you have the money to spend £150 or more on a high-end GPU? If not, you are getting a good deal out of the box.

    Integrated HD audio and video..... not that bad.
     
  8. burfadel

    burfadel MDL EXE>MSP/CAB

    Aug 19, 2009
    2,627
    3,856
    90
    #8 burfadel, Mar 8, 2011
    Last edited: Mar 8, 2011
    High-end gamers are the ones most likely to buy the i7; everyone else would buy an i3 or i5. It makes little sense paying for something you can't use! Even a budget gamer wouldn't waste their time with the on-CPU graphics on an i7; they'd more likely go for an i5 with an HD 5770/HD 6770, for example.

    It really depends on what you want to do with the computer. Unless the work is purely computational, a video card is beneficial! People buy an i5, and especially an i7, for gaming, video editing/graphic design, and other tasks that need good graphical power. It's all about balance. If you want a good gaming computer, you wouldn't spend all the money on a new i7 CPU and use its internal graphics, simply because it would suck! Even the computational argument loses merit when you consider that the trend is towards OpenCL/DirectCompute and the like.

    Realistically, who would buy what's considered a performance CPU, the i7, and then cripple what you probably intend to do with it (gaming, CAD etc.) by using the onboard graphics? If people buy the i7 for Blu-ray watching or general HTPC use, they are really misinformed: it's overkill. The onboard graphics isn't free; you just don't pay a separate price for it! If on-CPU graphics were developed for desktop/word-processing use and video playback, with the more intense video tasks offloaded to a separate video card (without duplicating features) for gaming, that would be the best of both worlds. It would require a change in the video cards, but they would become cheaper because they wouldn't need to duplicate the features the CPU already has (the video decoder and so on), plus software/hardware to tie the two together. With a standardised platform, you could mix an Intel or AMD CPU with an NVIDIA or AMD (ATI) card.
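
    On the compute side, a runtime such as OpenCL simply enumerates whatever GPUs its drivers expose, so in principle the integrated part could be handed work alongside the discrete card. A minimal sketch, assuming an OpenCL SDK and headers are installed; whether the Intel integrated GPU actually shows up as a device is entirely down to driver support:

    // List every GPU device each installed OpenCL platform exposes.
    #include <CL/cl.h>
    #include <cstdio>

    int main()
    {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
            return 1;

        for (cl_uint p = 0; p < num_platforms; ++p) {
            cl_device_id devices[8];
            cl_uint num_devices = 0;
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                               8, devices, &num_devices) != CL_SUCCESS)
                continue;  // this platform has no GPU devices
            for (cl_uint d = 0; d < num_devices; ++d) {
                char name[256] = {0};
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                sizeof(name), name, nullptr);
                std::printf("GPU compute device: %s\n", name);
            }
        }
        return 0;
    }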
     
  9. Oery

    Oery MDL Junior Member

    Nov 14, 2010
    71
    15
    0
    But it's crap for 3D games.......
    Have you ever compared i5/i7-xxxx 3D performance?
    Nothing comes close to....., let's say.........., the old but reliable GTX 260 (yes, it's only DX10).
    The GTX 260 gets (far.......) better FPS in every recent 3D game.........
     
  10. regal

    regal MDL Member

    Aug 26, 2009
    153
    6
    10

    Actually, all the Sandy Bridge parts are mainstream; the enthusiast socket is still 1366 until socket 2011 is released this fall.

    I bought a Sandy Bridge motherboard and am glad to have the integrated GPU. I only need it to watch movies on the TV, and I couldn't care less about PC games, as is true of 99.999% of all Sandy Bridge buyers. Having the integrated GPU saves me around 50 W of power.

    Until the PC gaming scene makes any sense, folks will be using their Xbox or PS3 for gaming. Eventually Intel will have an Xbox on their CPUs, along with the most important feature the PC gaming industry seems to have totally overlooked: universal arcade-quality game controllers.
     
  11. burfadel

    burfadel MDL EXE>MSP/CAB

    Aug 19, 2009
    2,627
    3,856
    90
    #11 burfadel, Mar 9, 2011
    Last edited: Mar 9, 2011
    If all you're doing is watching movies on TV, you don't really need an i7 or even an i5..., and that comes back to my previous point. An i7 isn't really a mainstream CPU; it's considered a performance part.

    If you're doing menial tasks, spending money on any i7 is actually pointless. It's much better to save the money now and keep it for a later upgrade. I know people say 'future-proof', but there is really no such thing with computers.

    Some valid reasons for buying an i7:
    - Gaming (requires a reasonable GPU to make the i7 worth it, say $230 or so for a good overclocked Radeon HD 6850)
    - 3D modelling, CAD, or any graphics-demanding application (requires a decent GPU)
    - DirectCompute/OpenCL-related computational tasks (requires a decent GPU)
    - Several other reasons I'm sure people could come up with (almost all of which will need a good GPU)!

    Onboard graphics does have a place, but why put it on a product intended for performance computing? Onboard graphics is really only suitable for tasks that don't require the fastest CPUs (yes, that does make sense if you think about it), apart from video or audio encoding, and those buying an i7 for those tasks could simply add a cheap PCI-E card. The fact is, the majority of i7 buyers, probably the 99.999% you mentioned, do actually need something a little better than what the CPU comes with! That you have to pay for the GPU as part of the CPU price of an i7 is ridiculous. It will mean the AMD Bulldozer CPU will be a lot more competitively priced against it, though :)
     
  12. acyuta

    acyuta MDL Expert

    Mar 8, 2010
    1,712
    397
    60
    Nominally, yes, the enthusiast socket may be the i7 9xxx, but Sandy Bridge has perhaps killed off the i7 9xx/X58 platform. There is no sense at all for most people in buying the entry/mid i7 series now, and the i7 99../Extreme series seems like a total waste of money for almost everyone. They could be substantially thrashed come late 2011. For me, however, going from an i7 to Sandy Bridge's successor may make more sense.
     
  13. Oery

    Oery MDL Junior Member

    Nov 14, 2010
    71
    15
    0
    You don't have to build a system just to watch movies on the TV; use a notebook or barebones PC with a Sandy Bridge processor in it.... they do fine and save a lot of power.......
    Intel has mixed up its plans for the i5/i7 with the M (mobile) series......
    A budget enthusiast (say, plain SLI..... not 3-way or 4-way SLI) will pick a Pxx chipset, because the Xxx chipset is overkill and pricey......
    I'm just questioning Intel's intentions and marketing ethics in wiping out CPUs without an integrated GPU.
     
  14. akf

    akf MDL Senior Member

    Aug 17, 2010
    345
    152
    10
    #14 akf, Mar 24, 2011
    Last edited: Mar 24, 2011
    Perhaps, on boot into the desktop, only the integrated GPU of the Core i7 could be switched on and used to power the Aero effects, leaving the dedicated GPU powered down. Only when the user starts a game or another GPU-intensive application would the onboard GPU be disabled and the dedicated GPU activated to provide maximum graphics performance. Yeah, it is just my imagination, but a scheme like that might justify the existence of the integrated GPU in the Core i7, and it would save electricity as well.
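
    The detection half of that idea is straightforward even today. A rough sketch, assuming a Windows SDK with the DXGI headers: list the adapters Windows sees, so an application could route heavy work to the one with the most dedicated video memory; actually powering the idle adapter down would still be up to the driver:

    // Enumerate every display adapter DXGI knows about and report its
    // dedicated video memory; an app could pick the largest one for games.
    #include <dxgi.h>
    #include <iostream>
    #pragma comment(lib, "dxgi.lib")

    int main()
    {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;

        IDXGIAdapter* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            std::wcout << L"Adapter " << i << L": " << desc.Description
                       << L" (" << desc.DedicatedVideoMemory / (1024 * 1024)
                       << L" MB dedicated video memory)\n";
            adapter->Release();
        }
        factory->Release();
        return 0;
    }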

    If the onboard GPU must be disabled and left idle whenever a dedicated GPU is present, then what is the point of having an integrated GPU inside the Core i7? The onboard GPU core wastes valuable die space in the processor. I would rather substitute that GPU core with another CPU core. I know that present applications cannot fully utilise hexa-cores and octa-cores (which may come as standard with the Haswell microarchitecture), but for hardcore gamers the chance of making use of additional CPU cores is much higher than of using the onboard GPU core.

    For that reason, Intel should have released a high-end enthusiast version of the Core i7 that does not contain an onboard GPU core at all, if there is no way the onboard GPU can supplement the add-on GPU.
     
  15. van.deepre@gmail.com

    Mar 21, 2011
    19
    0
    0
    I think Intel's Sandy Bridge is very good because you get two things at a lower price.
     
  16. Drerex

    Drerex MDL Novice

    Feb 9, 2011
    12
    3
    0
    I just use NVIDIA, so that chip would be a wasted GPU for me. I'm an X58 man, though, so there was no waste in my case anyway.
     