Graphics Cards: HBM 2.0 vs GDDR5X

Ray

Disciple
As we all know, the Pascal GPUs are absolute beasts and have blown away Nvidia's current offering. The GTX 1070 (which costs around ₹24k in India, the same as the GTX 970) is more powerful than the Titan X. It has made the current-gen GPUs obsolete.

The Pascal GPUs have GDDR5X.
I was wondering how HBM2 memory compares to GDDR5X.
Will Pascal's successor also make Pascal obsolete?
 
The GTX 1070 (which costs around ₹24k in India, the same as the GTX 970) is more powerful than the Titan X.
The 1080 is more powerful than the Titan X, and I guess after the 10xx series Nvidia will just indulge in mundane iterative upgrades; nothing innovative or groundbreaking will come out for at least a couple of years!
 
Not really, GDDR5X is almost as good as HBM and will be around for quite a while.
HBM2 is expensive and will only be used on high-end/enthusiast GPUs until at least Q3 2017, so if you're holding off on an upgrade just because it's not being used on mainstream cards, you'll have to wait a long time.
You can read these articles for specs and details:
http://www.anandtech.com/show/9883/gddr5x-standard-jedec-new-gpu-memory-14-gbps
http://www.anandtech.com/show/9969/jedec-publishes-hbm2-specification
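To put rough, illustrative numbers on the comparison (spec-level figures in the spirit of the JEDEC articles above; the 256-bit GDDR5X bus and the four-stack HBM2 setup are just example configurations, not any particular card):

```python
# Illustrative peak-bandwidth math for GDDR5X vs HBM2 (spec-level numbers,
# not any specific card; real products vary in bus width and clocks).

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# GDDR5X: a conventional 256-bit bus, but a very high per-pin rate (10-14 Gbps).
gddr5x = peak_bandwidth_gbs(256, 10)        # ~320 GB/s
# HBM2: only ~2 Gbps per pin, but a 1024-bit interface per stack.
hbm2_stack = peak_bandwidth_gbs(1024, 2)    # ~256 GB/s per stack
hbm2_four = 4 * hbm2_stack                  # ~1 TB/s with four stacks

print(f"GDDR5X, 256-bit @ 10 Gbps : {gddr5x:.0f} GB/s")
print(f"HBM2, one stack           : {hbm2_stack:.0f} GB/s")
print(f"HBM2, four stacks         : {hbm2_four:.0f} GB/s")
```

Even a single HBM2 stack lands in the same ballpark as a fast GDDR5X configuration, which is roughly why GDDR5X counts as "almost as good" for now; HBM2 only pulls far ahead when several stacks are used.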
 
HBM2 is expensive and will only be used on high-end/enthusiast GPUs until at least Q3 2017, so if you're holding off on an upgrade just because it's not being used on mainstream cards, you'll have to wait a long time.
The question is, do we need such raw power? I think GDDR5X is more than enough for casual eye-candy gameplay. (I don't think competitive gamers need such insane resolutions.)
 
The question is, do we need such raw power?

That is subjective, isn't it? Things wouldn't be where they are in the tech industry if everyone thought that way; just a few years back people were satisfied with ordinary phones with small screens, and yet here we are today with massive glass slabs sporting 2 GHz+ CPUs and 4 GB of memory.
Competitive gamers might not need such insane resolutions, but then again competitive games are also less demanding and are built to run on low-spec configurations without a hitch to reach a wider player base.
Thing is, there's no GPU at the moment that can completely utilise the massive bandwidth HBM2 offers, so using it would only increase costs without providing any added advantage. GDDR5X is indeed sufficient for now, but the industry won't sit around doing nothing; they think VR and 4K are the future and they are preparing for it.
 
Anybody remember RDRAM vs DDR RAM? Or what about HD DVD vs Blu-ray?

Well, HBM or GDDR5X could end up the same way. Personally, I think HBM is more likely to end up that way than GDDR5X. HBM is costly and has different architectural requirements, while GDDR5X is completely compatible with existing memory architectures while still providing its bandwidth benefits.

Still, market success is not determined solely by technical superiority or cost/performance balance. There can be other factors as well. Blu-ray won over HD DVD because the porn studios chose to go with it.
 
Hmm, no, that's an unfair comparison as they are not direct competitors. This is more like SATA 2.0 vs. SATA 3.0, PCIe 2.0 vs. PCIe 3.0, or USB 2.0 vs. USB 3.0, i.e., the newer one will slowly phase out the older one because of the significant advantage and performance boost it provides.
 
That is subjective, isn't it? Things wouldn't be where they are in the tech industry if everyone thought that way; just a few years back people were satisfied with ordinary phones with small screens, and yet here we are today with massive glass slabs sporting 2 GHz+ CPUs and 4 GB of memory.
Competitive gamers might not need such insane resolutions, but then again competitive games are also less demanding and are built to run on low-spec configurations without a hitch to reach a wider player base.
Thing is, there's no GPU at the moment that can completely utilise the massive bandwidth HBM2 offers, so using it would only increase costs without providing any added advantage. GDDR5X is indeed sufficient for now, but the industry won't sit around doing nothing; they think VR and 4K are the future and they are preparing for it.
I think gaming/GPUs are just a part of the whole ecosystem and don't really direct the industry. For instance, someone who doesn't play games can very well do without Pascal, or even with 5-6 generations old hardware for that matter. The real driver here is the operating system/software. If the OS/software demands newer hardware, the user has no option but to upgrade. Look at the evolution of Windows...
Or is it the other way around? o_O
 
I think gaming/GPUs are just a part of the whole ecosystem and don't really direct the industry. For instance, someone who doesn't play games can very well do without Pascal, or even with 5-6 generations old hardware for that matter. The real driver here is the operating system/software. If the OS/software demands newer hardware, the user has no option but to upgrade. Look at the evolution of Windows...
Or is it the other way around? o_O
I beg to differ. The gaming industry is solely responsible for the evolution of gaming hardware like GPUs; the OS/software has nothing to do with it. There was a time when games were made for the existing hardware. Now the hardware is made for gaming's needs.
 
@fleshfragger That's what I meant when I said it's subjective: just because you don't have the need for something doesn't mean there isn't a market for it elsewhere. You're speaking for yourself if you think there aren't any people who want to play TW3 or BF at ultra-high resolutions with all the bells and whistles turned on.

My dad uses an ancient Inspiron and will keep doing so until it gets in the way of his work, but that hasn't stopped manufacturers from releasing high-end ultraportables and laptops. I'm still using a three-year-old SGS3 that I have no plan of replacing until it breaks in two, but that hasn't stopped Samsung from releasing phones with curved screens and beastly specs. I have a friend who plays only Dota 2/CS:GO and is therefore content with the performance of the integrated graphics on his Haswell, but that hasn't stopped Nvidia/AMD, has it?

It's software dependent in a way, but it also depends on what kind of experience you are content with. For example, I had a friend who used a 9600GT for almost 5 years, and with every new title he had to reduce his resolution, to the point where he was gaming at 640x480. :p
Yes, he could play games like Crysis 2, but was it an enjoyable experience? Of course not.
 
I beg to differ. The gaming industry is solely responsible for the evolution of gaming hardware like GPUs; the OS/software has nothing to do with it. There was a time when games were made for the existing hardware. Now the hardware is made for gaming's needs.

Just putting a hypothetical question: let's say MS never goes the VR way, i.e. they release iterative functional system updates that are independent of your graphics processor. Would your neighbour, who doesn't game, ever need such advanced graphics processors? Look at all the offices here; they still run Core 2 Duo processors. I think these advancements are targeted at a specific audience alone. What's your take on this?
 
Just putting a hypothetical question: let's say MS never goes the VR way, i.e. they release iterative functional system updates that are independent of your graphics processor. Would your neighbour, who doesn't game, ever need such advanced graphics processors? Look at all the offices here; they still run Core 2 Duo processors. I think these advancements are targeted at a specific audience alone. What's your take on this?
A counter-question, which @psyph3r has already asked, but I would still love to ask you: do you think the world will stop spinning because you like winter more than summer?
 
That's what I meant when I said it's subjective: just because you don't have the need for something doesn't mean there isn't a market for it elsewhere. You're speaking for yourself if you think there aren't any people who want to play TW3 or BF at ultra-high resolutions with all the bells and whistles turned on.
Heck, I'm going for an SLI-based system the day Pascal is out! :p

I wanted your views on this topic! Not trying to impose my thoughts! :)
http://www.eurogamer.net/articles/2...on-graphics-tech-the-limit-really-is-in-sight

This article explains beautifully how our eyes will eventually bottleneck improvements in graphics, and I think it's true. It's extremely hard to distinguish between resolutions once you go beyond Full HD!
 
I think gaming/GPUs are just a part of the whole ecosystem and don't really direct the industry. For instance, someone who doesn't play games can very well do without Pascal, or even with 5-6 generations old hardware for that matter. The real driver here is the operating system/software. If the OS/software demands newer hardware, the user has no option but to upgrade. Look at the evolution of Windows...
Or is it the other way around? o_O

You do realize that gaming and cryptanalysis were the main driving forces behind the last 70 years of computing technology advancements. It is an understatement to say that gaming drives computing technology. The Unix operating system was written in part because it was a pain to load and run the Space Travel game they had written on a PDP-7. The C programming language was designed because it was a pain to write the Unix operating system and games in assembly. The entire evolution of DOS and Windows has been primarily driven by gaming needs; DOS EMS/XMS came into existence because of games. Why has DirectX become an integral part of Windows today? Parts of Windows were re-architected to better meld with DirectX technologies because of games. Why does Apple care so much about the GPU and the graphics API for the iPad/iPhone?

Computing industry hardware and software advancements are definitely driven by gaming.
 
I wanted your views on this topic! Not trying to impose my thoughts!

Lol, does my language come across as harsh or condescending? I think it does, but that's never my intention.
Anyway, there was a time when flat-screen TVs were luxury goods and super pricey; the first set I saw at a Sony showroom was priced at 5 lakhs. Would I have ever bought it at that price? Nope. But look at where we are now: every household has at least one and prices have come down tenfold. The same goes for VR and all these high-end goodies.
Wait for all these early adopters aka beta testers to iron out the issues, so that one day peasants like us can own an affordable pair.
 
Computing industry hardware and software advancements are definitely driven by gaming.

I beg to differ. Nvidia didn't put mankind on the moon! Gaming is a by-product of tech advancements that are usually made for hardcore scientific pursuits. Even VR has myriad engineering applications; gaming is of course just one of them!
 
Heck, I'm going for an SLI-based system the day Pascal is out! :p

I wanted your views on this topic! Not trying to impose my thoughts! :)
http://www.eurogamer.net/articles/2...on-graphics-tech-the-limit-really-is-in-sight

This article explains beautifully how our eyes will eventually bottleneck improvements in graphics, and I think it's true. It's extremely hard to distinguish between resolutions once you go beyond Full HD!


1. Resolution is not the only thing that matters.
Nowadays many games are open world with huge maps and long draw distances, which requires more memory.
The new Pascal GPUs have 8 GB of memory compared to Maxwell's 4 GB. Larger memory requires a more powerful processor and faster memory, which is why there is sudden interest in faster memory types in the industry, with GDDR5X and HBM.

2. For a long time we have been stuck with 1080p, but this year we have two new technologies:
VR and 4K.

These two technologies will rapidly push GPUs and CPUs over the next few generations. I expect Pascal and its successor to make HUGE improvements over their predecessors.
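A rough sketch of why 4K and VR push the pixel (and therefore memory bandwidth) load so hard; the refresh rates and the VR panel figures (two 1080x1200 eyes at 90 Hz, a Rift/Vive-class headset) are illustrative assumptions, not measurements:

```python
# Rough pixel-throughput comparison: how many pixels per second the GPU has to
# shade and fill at each target. All rates here are illustrative assumptions.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

full_hd = pixels_per_second(1920, 1080, 60)        # 1080p at 60 Hz
uhd_4k  = pixels_per_second(3840, 2160, 60)        # 4K at 60 Hz (4x the pixels)
vr      = 2 * pixels_per_second(1080, 1200, 90)    # two eyes at 90 Hz

print(f"1080p60 : {full_hd / 1e6:6.0f} Mpix/s")
print(f"4K60    : {uhd_4k / 1e6:6.0f} Mpix/s  ({uhd_4k / full_hd:.1f}x 1080p60)")
print(f"VR 90Hz : {vr / 1e6:6.0f} Mpix/s  ({vr / full_hd:.1f}x 1080p60)")
```

4K at 60 Hz is roughly four times the pixel throughput of 1080p60, and VR adds tight latency requirements on top of the raw fill rate, which is exactly the kind of workload the faster memory types are aimed at.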
 
1. Resolution is not the only thing that matters.
Nowadays many games are open world with huge maps and long draw distances, which requires more memory.
The new Pascal GPUs have 8 GB of memory compared to Maxwell's 4 GB. Larger memory requires a more powerful processor and faster memory, which is why there is sudden interest in faster memory types in the industry, with GDDR5X and HBM.

This equates to faster processing times, i.e. frame rates. Tell me something, can you really distinguish between 60 fps and 160 fps? (See the rough frame-time numbers below.)
2. For a long time we have been stuck with 1080p, but this year we have two new technologies:
VR and 4K.
True! I wonder what's after VR and AR!
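On the 60 fps vs 160 fps question above, a quick frame-time calculation shows why the returns shrink; this is pure arithmetic and says nothing about what any individual can actually perceive:

```python
# Frame-time arithmetic: the absolute time saved per frame shrinks as the
# frame rate climbs, which is one reason very high refresh rates feel like
# diminishing returns.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 144, 160):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# Going 30 -> 60 fps shaves ~16.7 ms off every frame; going 60 -> 160 fps
# shaves only ~10.4 ms in total, and each further step buys less and less.
```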
 