“Client Graphics” raises hopes that the event will also cover dedicated Intel graphics cards.
Shortly after presenting the Arrow Lake CPUs about a week ago, Intel is preparing for its next event: as the company announced on X, the “Intel BaseCamp” will take place in India on October 29, 2024.
The announcement mentions not only the upcoming processors, Arrow Lake for desktops and Lunar Lake for mobile, but also an “update for Intel client graphics”.
Let’s grow your business together!
Join us at Intel® Training and Intel® Partner Alliance BaseCamp for exclusive insights into the latest technologies and strategies that will elevate your success.
– Discover Opportunities with Intel
– Intel® Client Graphics updates
– Intel®… pic.twitter.com/0k23zAWvyS
— Intel India (@IntelIndia), October 15, 2024
As the X user “9550pro”, who is known for his hardware leaks, now suggests, this update could bring new information about the dedicated Battlemage graphics cards.
- This would at least fit with a somewhat older leak, the most recent specific one to date, which predicted the arrival of Intel Battlemage before the end of this calendar year.
- The portal 3DCenter agrees with 9550pro’s assessment: at the very least, Intel would have to officially show the Xe2 architecture in dedicated graphics cards if it still wants to secure the Christmas business.
So far, Xe2 has only appeared in a few leaks concerning dedicated graphics cards. Last September, a listing surfaced that was suspected to be based on the smaller BMG-G21 graphics chip; with an OpenCL score of 97,943 points, it is roughly on par with the Arc A770 and the RTX 4060.
- According to the listing, the BMG-G21 has 160 CUs (Compute Units), which would correspond to 2,560 shader units (see the quick calculation after this list). The maximum clock speed is given as 2,850 MHz.
- In addition, 12 GB of video memory can be found in the listing, but the type is unknown. Currently, GDDR6 is assumed.
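To put the leaked figures into perspective, here is a minimal sketch of how the 2,560 shader units follow from the 160 listed compute units. It assumes each reported compute unit is one SIMD16 Xe2 vector engine (16 FP32 lanes) and that eight vector engines form one Xe2 core, as Intel has described for Lunar Lake; neither assumption comes from the listing itself.

```python
# Rough sanity check of the leaked BMG-G21 figures.
# Assumptions (not from the listing): each reported compute unit is one
# SIMD16 Xe2 vector engine, and 8 vector engines make up one Xe2 core.
compute_units = 160      # CUs from the leaked listing
lanes_per_cu = 16        # assumed FP32 lanes per Xe2 vector engine
xve_per_xe_core = 8      # assumed vector engines per Xe2 core

shader_units = compute_units * lanes_per_cu   # 160 * 16 = 2,560
xe_cores = compute_units // xve_per_xe_core   # 160 / 8  = 20

print(f"{shader_units} shader units across {xe_cores} Xe2 cores")
```

Under these assumptions, the leaked 160 CUs would line up with the 2,560 shader units mentioned above and a 20-core Xe2 configuration.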
Now we want to hear your opinion: Can Intel play a role in the GPU market with the Battlemage generation, or do Nvidia and AMD have too much of a lead? What arguments speak against buying an Arc graphics card, in your opinion? Let us know in the comments!