RIP NVIDIA 2016
Slingin

Registered User
12 Years of Service
Posts: 356
Threads: 77
Joined: Sep 2011
Reputation: 22
Location: Mississippi
#11
RE: RIP NVIDIA 2016

(31 May 2016, 10:45 PM)SmG Ray88 Wrote: [Image: 1464748363171193.jpg]
I CANT EBEN

I FEEL BAD FOR THOSE THAT BOUGHT THE FE 1080 LMAO

#MAKEAMDGREATAGAIN
#RIPNVIDIA2016

just imagine the 480 in crossfire running at 100% gpu utilization
Take it from someone who ran an R9 295X2 for the last few years: if you are lucky enough that your game is optimized for AMD at launch, chances are CrossFire support will still take an asinine amount of time to perform up to par, and that's assuming they support it at all.  My last two cards were AMD, and the inconsistencies with their driver support drove me mad.  I am happy to move back to Nvidia.  A great friend of mine is an Nvidia fanboy, and no matter how much I argued with him, at least he didn't have to deal with the shit I had to.  I purchased a 1080 for 1440p/165 Hz, and it will hold me over well until a 1080 Ti launches.  AMD makes a hell of a product, but for me the switch was a simple decision.  Made $600 off my R9 295X2 on eBay anyway.
[Image: Signature%20SMG_zpsp3dbyxce.png]
01 Jun 2016, 02:53 PM
Ray88

Senior Member
12 Years of Service
Posts: 4,365
Threads: 237
Joined: Sep 2011
Reputation: 29
#12
RE: RIP NVIDIA 2016

I'm guessing you're missing the whole point of the DX12 improvements.
[Image: 869aa92efa.jpg]
i7-5820k @ 4.7 ghz, Asus X99 sabertooth TUF, Kingston DDR4-3200 16 GB 
Corsair H110i GT AIO, Samsung 850 Evo 250 GB x 2, EVGA GTX 1080 Ti SC2 Hybrid
 Asus ROG Swift PG279Q 1440p/165hz, Corsair AX760w, Corsair Obsidian 750D
[Image: 2348111620.png]
01 Jun 2016, 04:01 PM
Slingin
#13
RE: RIP NVIDIA 2016

(01 Jun 2016, 04:01 PM)SmG Ray88 Wrote: I'm guessing you're missing the whole point of the DX12 improvements.

DX12 does not magically solve optimization problems on its own.  Either way, it's yet to be seen what the true impact of a title in "true" DX12 is on the new architecture.
01 Jun 2016, 05:56 PM
SmG xBr4v3x

Senior Member
11 Years of Service
Posts: 374
Threads: 43
Joined: Mar 2013
Reputation: 9
Location: Stormwind
Discord: LukeWasWrong Admin#5075
#14
RE: RIP NVIDIA 2016

(01 Jun 2016, 04:01 PM)SmG Ray88 Wrote: I'm guessing you're missing the whole point of the DX12 improvements.
Again, it's in VERY few games out now, and VERY few are being developed in DX12. It's like saying how amazing the Windows Mobile OS is when not that many apps are on it, and not many are being made for it.
You Gotta be Brave to Embark On The Trail To Success
==============================================



==============================================
|||OLD|||
[Image: 5351139349.png]
|||NEW|||
[Image: 6151473987.png]




==============================================
01 Jun 2016, 06:05 PM
Ray88
#15
RE: RIP NVIDIA 2016

Since when have we disregarded a DirectX release? DX12 is the future whether you like it or not.
Considering AMD has put a massive amount of R&D into DX12 and related low-level tech such as Vulkan, Mantle, and async compute, I only see AMD getting better.

The main reason CrossFire/SLI was bad under DX11 was alternate frame rendering (AFR): frames were queued one after the other between the communicating GPUs, which added latency and notoriously caused micro-stuttering. Basically the two GPUs communicated with each other to act as ONE GPU.

Split frame rendering (SFR), on the other hand (which DX12 enables), has each GPU render a specific region of the screen independently of the other, dramatically decreasing latency. One GPU would be assigned the left half of the screen while the other GPU takes the right half.
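The AFR vs. SFR split described above can be sketched in a few lines. This is illustrative Python only, not real driver code; the function names are made up, and all it models is which GPU touches which frame under each scheme:

```python
# Toy model of the two multi-GPU strategies: AFR alternates whole
# frames between GPUs, SFR splits every frame into per-GPU regions.

def afr_schedule(num_frames, num_gpus=2):
    """Alternate Frame Rendering: frame N goes to GPU (N % num_gpus).
    Frames are queued one after another, which is where the
    micro-stutter complaint comes from."""
    return [(frame, frame % num_gpus) for frame in range(num_frames)]

def sfr_schedule(num_frames, num_gpus=2):
    """Split Frame Rendering: every frame is cut into num_gpus screen
    regions and each GPU renders its own region of the same frame."""
    return [(frame, region)
            for frame in range(num_frames)
            for region in range(num_gpus)]

# AFR: GPUs take turns on whole frames.
print(afr_schedule(4))   # [(0, 0), (1, 1), (2, 0), (3, 1)]
# SFR: both GPUs contribute a region to every frame.
print(sfr_schedule(2))   # [(0, 0), (0, 1), (1, 0), (1, 1)]
```

The (frame, gpu) pairs make the trade-off visible: under AFR a slow frame stalls the queue, while under SFR both GPUs finish pieces of the same frame.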
(This post was last modified: 01 Jun 2016, 09:02 PM by Ray88.)
01 Jun 2016, 08:48 PM
Beasty

The Lord thy God
Senior Member
11 Years of Service
Posts: 7,654
Threads: 206
Likes Received: 14 in 11 posts
Joined: Jun 2012
Reputation: 124
Location: Massachusetts
Discord: betsy#4548
#16
RE: RIP NVIDIA 2016

(01 Jun 2016, 08:48 PM)SmG Ray88 Wrote: Since when have we disregarded a DirectX release? DX12 is the future whether you like it or not.
Considering AMD has put a massive amount of R&D into DX12 and related low-level tech such as Vulkan, Mantle, and async compute, I only see AMD getting better.

The main reason CrossFire/SLI was bad under DX11 was alternate frame rendering (AFR): frames were queued one after the other between the communicating GPUs, which added latency and notoriously caused micro-stuttering. Basically the two GPUs communicated with each other to act as ONE GPU.

Split frame rendering (SFR), on the other hand (which DX12 enables), has each GPU render a specific region of the screen independently of the other, dramatically decreasing latency. One GPU would be assigned the left half of the screen while the other GPU takes the right half.

But that's the problem: DirectX 12 isn't that big right now. You can say AMD is smart for focusing on multi-GPU setups to decrease the cost, but that relies on DX12 becoming mainstream relatively quickly. If DX12 isn't supported by everything soon, it's pointless, because Nvidia will have time to get on the train while still winning in the meantime with a superior single card.

I mean, again, if you think DX12 will become widespread relatively quickly then it makes sense, but we can't know that. I'd rather get the 1080 than bank on the market shifting a specific way, especially since I live in America, where the 1080 isn't as expensive compared to Europe.
(This post was last modified: 02 Jun 2016, 12:15 AM by Beasty.)
02 Jun 2016, 12:13 AM
SmG xBr4v3x
#17
RE: RIP NVIDIA 2016

(01 Jun 2016, 08:48 PM)SmG Ray88 Wrote: Since when have we disregarded a DirectX release? DX12 is the future whether you like it or not.
Considering AMD has put a massive amount of R&D into DX12 and related low-level tech such as Vulkan, Mantle, and async compute, I only see AMD getting better.

The main reason CrossFire/SLI was bad under DX11 was alternate frame rendering (AFR): frames were queued one after the other between the communicating GPUs, which added latency and notoriously caused micro-stuttering. Basically the two GPUs communicated with each other to act as ONE GPU.

Split frame rendering (SFR), on the other hand (which DX12 enables), has each GPU render a specific region of the screen independently of the other, dramatically decreasing latency. One GPU would be assigned the left half of the screen while the other GPU takes the right half.

We are talking about the present, not the future in 5 years.

You're not hearing us fully. It's smarter to go single vs. multi because you're guaranteed the full possible performance from your card, with no guessing game about whether the next game out will support CF/SLI.

Imagine having a pair of shoes you can only use together one day out of 7, and only one shoe the other 6 days. That's how CF/SLI with DX12 is for now.
02 Jun 2016, 02:09 AM
Ray88
#18
RE: RIP NVIDIA 2016

You're just beating around the bush. DX12 literally just came out last year, and as time goes on devs will continue to support the latest DirectX. True, it may take a few years, and by then Nvidia will have released Volta, which supports DX12 better than Pascal, but just because only a few games support DX12 now doesn't mean a lot more titles won't show up later this year.

CrossFire/SLI isn't even that bad compared to a few years back. Drivers get updated almost instantly to support dual GPU, and the fact remains that the best SLI games scale at around 90%. If DX12 proves out, Nvidia will be donezo if AMD can deliver a GPU that CrossFires into the ballpark of 80-90% of GTX 1080 performance while only costing 57% as much. No one in their right mind would spend $700 to achieve almost the same performance as a setup that costs less than $500.

When the NDA lifts on the 29th of June, if the RX 480 does have the performance of a GTX 980/R9 Fury while only costing $199, RIP Nvidia. I honestly hope AMD delivers, which would force Nvidia to drop the price(s) on their flagship cards.
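The arithmetic behind those figures checks out. A quick sketch, using only the numbers floated in this thread (the rumored $199 RX 480, the roughly $700 GTX 1080, and the claimed best-case ~90% dual-GPU scaling — the poster's claims, not benchmarks):

```python
# Prices and scaling are this thread's claims, not measured data.
gtx1080_price = 700
rx480_price = 199
best_case_scaling = 0.90  # claimed best-case CrossFire/SLI scaling

pair_price = 2 * rx480_price              # $398, "less than $500"
cost_ratio = pair_price / gtx1080_price   # ~0.57 -> the "57%" figure

print(f"two RX 480s: ${pair_price}, "
      f"{cost_ratio:.0%} of a single GTX 1080")
```

So the claim reduces to: roughly 80-90% of the performance for roughly 57% of the money, if (and only if) a game's CrossFire scaling actually lands near that best case.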
(This post was last modified: 02 Jun 2016, 09:41 AM by Ray88.)
02 Jun 2016, 09:40 AM
Slingin
#19
RE: RIP NVIDIA 2016

(02 Jun 2016, 09:40 AM)SmG Ray88 Wrote: You're just beating around the bush. DX12 literally just came out last year, and as time goes on devs will continue to support the latest DirectX. True, it may take a few years, and by then Nvidia will have released Volta, which supports DX12 better than Pascal, but just because only a few games support DX12 now doesn't mean a lot more titles won't show up later this year.

CrossFire/SLI isn't even that bad compared to a few years back. Drivers get updated almost instantly to support dual GPU, and the fact remains that the best SLI games scale at around 90%. If DX12 proves out, Nvidia will be donezo if AMD can deliver a GPU that CrossFires into the ballpark of 80-90% of GTX 1080 performance while only costing 57% as much. No one in their right mind would spend $700 to achieve almost the same performance as a setup that costs less than $500.

When the NDA lifts on the 29th of June, if the RX 480 does have the performance of a GTX 980/R9 Fury while only costing $199, RIP Nvidia. I honestly hope AMD delivers, which would force Nvidia to drop the price(s) on their flagship cards.

It is that bad... That's why I am JUST now switching to a single-GPU card.  I can buy another 1080 later on for SLI, or I can wait for a 1080 Ti.  Either way, that's overkill right now for 1440p, and personally I think gaming on a 27" monitor in 4K is counterproductive.  AMD is smart, but if they want to get into the game at a more hardcore enthusiast level, they should concentrate on powerful SINGLE-GPU cards and back off the $1500 price tag for dual-GPU cards.  I loved my R9 295X2 for what it was, and got it for only $800 at the time, but it was aggravating as hell basically playing off a single R9 290 half the time for that price, even at $800.  No more waiting or wondering; I'm switching to a single GPU.  By the time DX12 becomes the standard, we will be talking about the next generation of cards anyway.
02 Jun 2016, 05:37 PM
Ray88
#20
RE: RIP NVIDIA 2016

AMD honestly has no incentive to enter the enthusiast market. Steam survey numbers alone show that cards at GTX 980 performance and above account for only a small fraction of the installed base. They specifically targeted the mainstream market and offered a card with excellent performance/$ that many people can afford.  Btw, SLI 1080s aren't overkill if you have a high-refresh-rate 1440p/1080p monitor.
(This post was last modified: 02 Jun 2016, 05:58 PM by Ray88.)
02 Jun 2016, 05:55 PM


SmG Gaming, © 2010-2024