Intel's Graphics Cards Could Be BEAST Mode!

July 28, 2019

Oh my gods, Intel is releasing... wait, sorry, Intel is *going to* release discrete GPUs next year or the year after, and there's been a leak about it that I want to discuss, because if it's true it's kind of interesting. Now, I will never be one of those people who just reads off WCCFtech articles and calls it news; I always try to give you my own spin on it, my own take. So when I'm talking about leaks and rumours, I try to infer things from previous generations' performance.

What we know about all of Intel's previous graphics generations is that the basic building block, the execution unit (EU), contains about eight shader ALUs. So the terminology maps like this: execution unit for Intel, compute unit (CU) for AMD, streaming multiprocessor (SM) for NVIDIA. If I use those terms from now on, SM means NVIDIA, CU means AMD and EU means Intel, and I'll probably mix them up and say one for the other, so enjoy the lottery.

Anyway: the execution unit has eight shaders, and the rumour is that one of Intel's upcoming discrete GPUs, which we could be seeing next year or the year after, will have 512 execution units. Now, that number looks frightening, because you look at AMD's graphics cards and go, how many compute units are in Vega? 64. How many SMs are in Turing? About 72. So you start going, how the hell are Intel going to fit that many compute units, what the fuck is going on? But then you remember it's only eight shaders per execution unit. That might change in future, we don't know, but assuming everything stays the same, because I imagine all they're going to do is scale up the graphics that's on Ice Lake, or
whatever the 10-nanometre stuff ends up being called, and scale that out into a discrete GPU. The current integrated part has 32 execution units, which at eight shaders each gives a grand total of 256 shaders. And 256 shaders gets you roughly the performance of Vega 11. I think Vega 11 is slightly faster, though it all depends on memory speeds and that kind of thing; I'd argue Vega is still a little bit more powerful. But Vega 11 has 704 shaders. That is a lot of shaders. The fact that it's 704 shaders on the Vega 11 part and only 256 on the new Intel integrated graphics tells you something.

Now, everything changes when you scale up. A lot of the time you don't get linear progression: performance doesn't go up linearly, it usually takes a hit, and as you get to bigger numbers you start to lose efficiency. We don't know how Intel's architecture scales, so let's talk about what we do know. What we know is that Intel tends to perform better with lower amounts of shaders than AMD does. We also know that Intel are what I would consider money-grabbing feckers who will try to take the market from everyone, because they want to control it. In that respect Intel are very much like NVIDIA, although I'd say Intel don't tend to take the piss quite as much as NVIDIA does; still, Intel are no strangers to sneaky, underhanded tactics, so we have to be careful.

At the same time, we've seen the benchmarks, not leaked but actual benchmarks, of the current generation that's coming to laptops, the Gen 11 graphics, or whatever generation it is; I can't remember, I should do my research before I do my videos. Anyway, that
graphics is roughly equivalent. They compared it to, I think, Vega 11, or maybe Vega 10, one of the Vegas, and the Vega parts have way more shaders: the Intel part had about half, or a third, of the shader count the Vega part had, and they were in and around the same performance. So if you look at something with 512 execution units, that's about 4096 shaders. That's massive, and it tells you that Intel are going after the high-end segment of the market. If they don't lose a lot of performance by going bigger, that tells you they're aiming high, because 512 is a huge number. You hear "512 execution units" and you might think they were individual shaders, but they're not: an EU is a small compute block, closer in kind to AMD's compute units or NVIDIA's SMs (though much smaller), and I think Intel cluster EUs into bigger blocks as well, which might be the more direct equivalent.

So you look at 4096 shaders, and Vega 64 has 4096, and as I said before, if you compare Vega's performance to Intel's, Intel is beating it with fewer shaders. Now, we could talk about how the benchmarks Intel gave were a bit skewed in Intel's favour, because they showed you the memory they were using, and they were using faster memory. I always say the biggest bottleneck on an APU is not the amount of shaders it has but the kind of memory it's using: really fast memory will lift performance more than adding dramatically more shaders will. The biggest bottleneck for all APUs is graphics memory, because they use system memory, which is a lot slower than dedicated graphics memory. DDR4 is a lot slower than GDDR5. Now,
how much slower? On the order of, Jesus, two to three times slower than GDDR5, never mind the GDDR6 or HBM on current discrete GPUs. So that's the big bottleneck there. But for Intel's discrete graphics, I imagine they're going to put GDDR6 on their dedicated discrete GPUs, so that changes when you look at what they're going to be doing.

I always thought Intel were going after the HEDT market, you know, the server market, the data-centre market. I still think they're going after that, but obviously when you're going after the data centre you're going to have reject silicon that you can sell to gamers. Selling to gamers is a good pipe cleaner for getting your stuff ready before going up against the big boys. And it would be interesting to have a third player come into the market with, say, RX 5700 XT-level performance for like three hundred quid, and then everybody's going, what the fuck is going on here? It can only be good for us to have a third party in the market.

Where it becomes problematic is when one party excels above everyone else and the other two are left fighting amongst themselves; eventually one of those two dies off, and when that happens you've got a duopoly again. I don't want a duopoly. I want multiple companies competing for segments of the market, because that's really good for us as consumers. There is nothing bad about Intel making GPUs, nothing. The only caveat I'd put on Intel making GPUs is: be very careful, because if they become market leader they'll just do what they did to the CPU market for god knows how long. Their modus operandi is to coast on that leadership and not really shoot off into the stratosphere. NVIDIA shoots off into the stratosphere but gives you the scraps while they're selling the big
stuff to the data centre, whereas AMD, of all three companies, is probably a little bit of both, somewhere in the middle. So that's something to think about. But it is exciting to see a third party entering the market, and not coming in with some entry-level shitty GPU: they're going after the big-boy graphics performance, maybe even the crown, if those 4096 shaders are real.

If this is accurate... and I can't remember where I first saw this, so you'll have to forgive me, I do this all the time. I saw it on WCCFtech as well, but I didn't read it there first, so I don't know who wrote the original article or who to credit. I'm going to say WCCFtech have it, so if you want to read about the 512 execution units you can read it there; beyond that, I don't know where I saw it first this morning.

If it's true, it's exciting, and it means Intel are not doing what I genuinely thought they would do. I thought Intel would enter the market with something like an RX 480 kind of card: 200 quid, probably a biggish die for a 200-quid card, but pretty much all they could manage, and then as the generations went on they'd improve and iterate until eventually they reached parity in the GPU market. But if they're coming out with a GPU that has 4096 stream processors, or ALUs, or whatever they call them, that's going after the high end. AMD's big boy right now, the RX 5700 XT, has 2560; NVIDIA's big boy has around 4350. So 4096 is going after the big-boy market. It's Intel saying, I have my big-boy pants on and I'm making GPUs now. That's the statement I think Intel are making, and it's either going to be a
genuine statement, or it could be a distraction; they could be blowing smoke, the 512-execution-unit GPU could be made up, and that would be disappointing. Or it could be real but aimed squarely at the data centre, and that's where they'll market it. Because remember, Intel's core business is not selling CPUs to you on the desktop. Their core business is providing solutions for gigantic companies. They've gotten so big that what usually happens is a big company says to Intel, we'd like a server with this many cores and this much memory, we don't want to have to do this, and we want to be able to do that, and Intel's custom group goes, yeah, we'll sort that solution out for you. That's their core business. Some could argue they got a little bit lost, some could argue they lost focus, but that's what they do, and they make billions and billions of dollars every year providing solutions for companies rather than for you, the consumer. You, the consumer, get the dregs: the smaller pieces, the bits that don't really matter any more.

So when you look at it that way, yeah, they could absolutely be doing the same in the GPU market. They could be building GPUs for the data centre so that their clients don't have to build out a full Intel-based supercomputer or server with thousands of Intel cores and then ask, where are we going to source our GPUs from, and end up sourcing them from AMD. Because AMD can say, look, if you take five thousand Epyc CPUs off us, we'll throw in five thousand Vega GPUs for free, or for half price. AMD are offering that bundled solution and Intel currently aren't, so if Intel could offer a GPU solution alongside the CPUs, I'd imagine that's where the money is for Intel. But all that means is that those big-
boy dies: some of them won't make the cut for the data centre, and we'll get them as gaming cards. Isn't that good for us? So we'll see.

The thing is, people seem to play down the gaming market, and the gaming market is huge. Look at the amount of money to be made even in discrete GPUs; look at NVIDIA's bottom line. NVIDIA makes billions of dollars a quarter from gamers. That's a big market, and I don't think anybody should be turning their nose up at it. In fairness the CPU market is much larger, but the discrete GPU market for gamers is growing, and while desktop sales overall are falling, desktop sales for gamers aren't. Like, my friend's little brother has just built himself a PC. He was always a console guy, and now he says he doesn't want a console, he wants a PC. The kids now, all they talk about is gaming on PC; my nephew and his friends, all of them. A few years ago it was all consoles; even two or three or four years ago it was consoles you talked about. It's all changing.

So it's interesting to see this, and I'm happy another player is entering the GPU market. There's no negative here. The only negative I can find is if one company gets so far ahead of everyone else that it unfortunately kills off one of the others, and they leave the GPU market. But other than that, no.

There are also, you know, other rumours floating around, and I won't say anything else, but there may be yet another party about to enter the GPU market. Go and watch the Broken Silicon podcast by Moore's Law Is Dead; Coreteks is on it. It's very, very interesting, you'll enjoy it, and there's a little nugget of truth at the end that's quite good. So anyway, I'll talk to you later. Don't forget to like, comment, share, subscribe... like, well,
actually, I said "like it": like it if you liked it, dislike it if you didn't, but a dislike on its own won't fix anything, so comment and tell me what went wrong. And comment something anyway, let me know what you think. Do you think it's good that Intel are making GPUs? Are you excited that it looks like they're actually going to make big GPUs rather than small ones? Because that's genuinely all I thought they were going to do: make small GPUs that didn't really compete, and then eventually scale up. But it looks like, no, they're going to be building big-boy GPUs.

And yeah, I have a Patreon if you'd like to help me buy hardware for the channel; it's completely up to you, and I don't try to lock much behind paywalls. I am planning on doing a Q&A or something like that, so as long as I get enough questions in, I'll do it. The way to send one is either a message on Discord or a message on Patreon; if you go through Patreon I'll probably prioritise patrons' questions over everybody else's, just to be fair to them. I'll talk to you in the next one, and don't forget to like, comment, share, subscribe; sharing is the most important. Stop recording.
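The back-of-the-envelope numbers in the video are simple enough to sanity-check. Here's a rough sketch in Python; the per-block shader counts come from the vendors' public architecture documentation, while the 512-EU part (and therefore the 4096-shader figure) is still only a rumour:

```python
# Rough "compute block" sizes. An Intel EU is far smaller than an AMD CU
# or an NVIDIA SM, which is why 512 EUs is less scary than it sounds.
ALUS_PER_BLOCK = {
    "Intel EU": 8,     # 8 ALUs per execution unit (Gen9/Gen11 graphics)
    "AMD CU": 64,      # 64 stream processors per GCN compute unit
    "NVIDIA SM": 64,   # 64 CUDA cores per Turing SM
}

def total_shaders(block_kind: str, blocks: int) -> int:
    """Total ALU ('shader') count for a given number of compute blocks."""
    return ALUS_PER_BLOCK[block_kind] * blocks

print(total_shaders("Intel EU", 512))  # rumoured discrete part: 4096
print(total_shaders("AMD CU", 64))     # Vega 64: also 4096
print(total_shaders("Intel EU", 32))   # current 32-EU iGPU: 256

# And the APU memory bottleneck: peak bandwidth in GB/s.
def ddr_bandwidth_gbs(mt_per_s: float, channels: int) -> float:
    """System RAM: transfers/s x number of channels x 8-byte channel width."""
    return mt_per_s * channels * 8 / 1000

def gddr_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Graphics RAM: per-pin data rate x bus width in bytes."""
    return gbps_per_pin * bus_width_bits / 8

print(ddr_bandwidth_gbs(3200, channels=2))  # DDR4-3200 dual channel: 51.2
print(gddr_bandwidth_gbs(14, 256))          # 256-bit GDDR6 at 14 Gbps: 448.0
```

Note the gap between dual-channel DDR4 and a 256-bit GDDR6 card is closer to 9x than the 2-3x an iGPU sees against a narrow GDDR5 bus, which is exactly why a scaled-up discrete part would need GDDR6 or HBM rather than system memory.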


49 Replies to “Intel's Graphics Cards Could Be BEAST Mode!”

  1. Seizon Sha says:

    They have no choice but beast mode and have a good price. Competition is good for us.

  2. Andrew Aiken says:


  3. Novavolex says:

    A third player is always good. I'm not thinking that they will be at the high-end level of AMD or NVIDIA (at least in the first several years), but for sure they will shoot for the mid range (where the big money is). They have the manpower for very good drivers, so I think Intel will start strong.

  4. Eva Langley says:

    On 14nm+++… CMON!!!!!

  5. bobiseverywhere says:

    4096 Shaders and 500MHz lol so just right at the bottom of the barrel. I am kidding, also super excited to have a third party in the market again.

  6. The Fin says:

    Will be an exciting time when Intel enters the consumer GPU market , cannot wait .

  7. john bumptybump says:

    I am happy about intel big GPU but we knew they were going to anyway. They want the GPU compute market. Still hoping for a small small gpu alongside it, i was hoping for something like one of their iGPUs on an extremely low power card. It would give us more options.

  8. Ron Paynter says:

    Using Intel's integrated graphics have been an option for ages and ages now. They are already the 3rd gpu option on the market.
    Just because they have always sucked a bit compared to Nvidia and AMD doesn't really change that.

  9. kevin ham says:

    actually you're wrong, there is no chance of AMD getting out of the gpu market considering they are and have been making the consoles, so they aren't going anywhere. nvidia definitely isn't going anywhere, we all know that as well. intel is the only 1 of the 3 that could drop out of the gpu game, and that'll only be if their gpus completely flop. and it would have to be a fucking disaster for them to just call it quits after all the money they spent and all the people they hired.

  10. Diksaca Yehovah says:

    Fuk Intel.

  11. kevin ham says:

    i think intel is going to retire the 2080ti at the same price or less

  12. sesom07 says:

    You forget something important. You the consumer should rent the game stream that´s made in the datacenter. It´s completly fine that you own the crappy GPU as long as it can show the stream. That´s what this stuff is about.

  13. Varoom Zazoom says:

    Why aren't there any beefy APU's on a card for isolated in system virtualization work?

  14. Open Mind says:

    Intel has seen how the GPU market is dominated by Nvidia .Amd with its low R&D budget can't compete with Nvidia in GPU department.So the Biggest Giant in business has decided its time to take down Nvidia not be the second (as intel never tasted it before) .I think Intel's target would be Nvidia ,not Amd .So those 4096 units most probably is to compete with Nvidia's 1080Ti cards both in performance & power consumption /temperature

  15. Open Mind says:

    Intel's GPU will be sold like Hotcake ,Performance or not GPU enthusiast will get one to keep one at any cost

  16. Michael Garcia says:

    Does anyone here has a Radeon 7? Do you guys has a lumion software? i'm planning to build my workstation more on 3d rendering and i'm choosing between rtx 2070super with 8gb vram vs radeon 7 16gb hbm2. i desperately needed your response guys, hope you can help me. thank you!

  17. xyz360400 says:

    Wouldn't it be something if Intel started making the best GPUs and ended up bailing themselves out of the troubles they've been in due to CPU issues by switching their business focus to GPUs and in 5~10 years we end up with AMD as the main CPU vendor, Intel as one of the main GPU vendors (while likely making APUs and laptop chips on the side; something they're still pretty decent at) and we're left with 3 competitive GPU vendors while playing 'counting cores' all day with AMD constantly pushing up the core count on the desktop. Definitely fascinating. Still, I have to hope that AMD eventually really overtakes the IPC of Intel's late gen chips (right now they're about even in most areas and just slightly behind in a few, so what I seek is real progress beyond that level on top of all the cores they give us as that would be amazing; imagine a 64 core consumer desktop CPU with 2x the IPC of Coffee Lake/Kaby Lake etc.).

  18. Micheal Xlr says:

    Paul: You should do my research before I do my videos.

    Me: OK.

  19. DarthRaver86 86 says:

    I never get notifications for your videos. EVER.

  20. Jim R says:

    Money grubbing feckers… LOL

  21. bobhumplick says:

    if the intel grpahics has a really big cache then that might have influence the benchmarks as well.

  22. bobhumplick says:

    yes, scaling is a big part of why an intel igpu with fewer ALUs performs at or near the vega 11 part. if intel scaled up to that same size there wouldn't be that much of a difference, but i still think it would be a pretty good difference. intel has been designing their graphics around system memory so far, and things like tile-based rendering and cache structures that amd haven't looked into might benefit intel. the real problem for intel for a while will be drivers, i think. but they don't have the shortage of software people that was killing amd, so they will fix that pretty soon after launch i think

  23. bobhumplick says:

    intel's EUs are not exactly the same as a CU. what intel calls a subslice is more like a CU, while a slice is more like one shader engine on an amd gpu. for instance, a subslice has 8 EUs and each EU has 8 ALUs, so that's 64 shaders per subslice, just like amd and turing.

    the 512 EU gpu looks very similar to a vega 64 with 4096 fp32 units, and it has double-rate fp16. if they don't geometry-bottleneck it then it could be a good card. i'm worried that they may have similar problems to gcn's 4-wide wavefront compared to rdna's 2, but as long as the geometry performance doesn't limit it then it still might be a good card. and intel press releases do mention a form of tile-based rendering. tile-based rendering is part of the reason nvidia cards aren't as memory limited and why the efficiency is pretty good as well. i expect these cards to be a bit compute heavy but not as bad as vega when it comes to graphics (per core i mean). i hope anyway

  24. Making Plays says:

    His head is more egg shaped than egg itself 😀

  25. TheTechnoEcho says:

    That intro made me jump.

  26. Dane Reviews says:

    If Intel is going for the performance crown they will most likely demand a premium for it, much like nvidia does, but it could force nvidia to drop prices

  27. bobbavet says:

    Raja Koduri is a myth.

  28. Dex4Sure says:

    By the time its out, Nvidia already have 7nm out wiping the floor with it. People are too excited over these GPU's. I expect to be a flop just like when they tried to compete with Qualcomm/ARM on mobile processors and then later with Qualcomm on 4G and 5G modems, both times Intel failed miserably. Why would this time be any different?

  29. Adnan says:

    I don't belive in intel gpu's for a second. I am disgusted by that very thought. But thats for the next 10y,maybe 5y but now its a joke.

  30. SylverRaptor says:

    I want my Raja Koduri signed 150$ priced 8 GB 5700 competitor. With 3d stacked RAM. Go Intel, go!

  31. Michael Nager says:

    It's Raja Koduri we are talking about – of course they are going to be shite.

  32. CastAway_Dave says:

    Since they got people from AMD, id bet it will resemble AMD under the hood. Trade secrets?

  33. Harout Darmanchyan says:

    Someone forgot to bubble that tripod head lol
    Also maybe a tad bit too much headroom. I don't know…maybe it's just me. That last part was pure sarcasm.

  34. Alexander Yordanov says:

    The Gaming market may be huge, but it is not important for humanity. All it has is money and even there all the tenets of the new industrial revolution will overtake it.

  35. beachboy boobybuilder says:

    If these intel gpu's turn out to be very good, then that means raja deliberately fucked polaris and vega (which I have been saying ever since he jumped ship to shittel where he had a very cushy VP of gpu job waiting for him. How convenient).

  36. Rhayr Harry says:

    Third horse in gpu race would be fantastic.

  37. TxT Peer says:

    Samsung GPUs ? Coming soon ?

    AMD and Samsung's GPU Licensing Deal
    Nvidia Will Team Up With Samsung For 7nm GPU Technology
    so what is next Samsung are launching their 1st GPU ?

  38. NANOHORIZON says:

    Intel is going to bust Nvidia up at the high end while undercutting them slightly. Nvidia's the one they're after, not AMD. AMD (thus far) can't compete at the high end.

  39. Peter Jansen says:

    MLiD is on my blocklist. Too bad, I like Coreteks.

  40. Jack Phillips says:

    Your Outro is Iconic

  41. beachboy boobybuilder says:

    The basic, entry level shittel gpu will cost at least £300.

  42. Skylancer727 says:

    I think PC gaming really took off this generation thanks to how weak the PS4 and Xbox One was. They really disappointed people, plus some people just prefer high frame rates over 4K, but they just don't offer that.

  43. tamarockstar45 says:

    I just read the wccftech article. "You can think of EUs as the number of cores, similar to NVIDIA’s CUDA core and AMD’s Stream processor core count" Sounds like it's 512 processing units, not 4096. This is probably their mobile parts.

  44. ShadyKillas 69 says:

    I just want a gpu that can run full ray traced 4k graphics at 250fps. Will we see that any time soon… nope but in my lifetime I can hope 🙂

  45. Slane583 says:

    Intel may not be my first choice, but I'd still try an Intel graphics card over any overpriced stuff nVidia offers. Besides, I think Intel could give nVidia a better fight than AMD because Intel's R&D budget is bigger than nVidia as a company. 🙂

  46. Jo Grey says:

    Watch nvidia start making chipsets for AMD again… then we will have PC's with AMD CPUs, Nvidia chipsets, and Intel GPUs… that would be a strange world to live in lol.

  47. AsianPerson3 says:

    I felt this coming. Intel has been working on their igpus for years. I believe they have been testing the waters now a Tsunami approaches. I'm holding on for dear life. Come on Intel!

  48. TroubleShooter says:

    Intel is currently ignoring the consumer desktop market to concentrate on laptops and datacenter servers. My guess is that Intels gpu will do the same. It will be a gpu aimed at the VERY expensive data center market where gpu's cost $3000 or more. Anything we get in the consumer side will be the leftovers and binned failures. Thats my guess anyway.

  49. Arthur of Legend says:

    Bought an RX 5700 and the drivers were shit and my whole system just didn’t want to fucking cut it so I just returned it and bought an RTX 2060 Super. New drivers are really good as well. AMD needs to also step up their CS department as telling me to fucking boot into my bios and update it is child’s play.

Leave a Comment

Your email address will not be published. Required fields are marked *