/g/ - Technology


File: 213 KB, 1273x740, Nvidia-GTX-1080-Ti-Featured.jpg
No.58608145

Is it worth waiting for the GTX 1080 TI?
Is it overkill? Will it break down before it starts going outdated?

>> No.58608309

NVIDIA is a meme company. Its whole existence is based on overpriced dies. AMD has exactly the same access to foundries, but it makes smaller dies because it wants to keep costs low.

It's easy to meme people with GPUs because they work in parallel by definition.

When you try to do the same with CPUs it falls apart, and that's why AMD looks bad.

>> No.58608392

If you're worried about it breaking down, don't be. Nvidia will gimp the drivers long before wear and tear becomes an issue

>> No.58608410

>>58608309
>>58608392
So there's no really good solution?

>> No.58608432

>>58608410
Unless you've got some specific niche that only a 1080/Ti can fill, there isn't much point in spending an absurd amount of money on one to begin with

>> No.58608452

>>58608410

I'm not following the very latest but usually the golden solution is somewhere in the middle. Last time I looked at it I figured that for my 1080p needs an excellent/the best price/performance ratio was an AMD R9 290 that was just being released at the time.

>> No.58608487

>>58608410
buy whats good now
don't bother waiting for something without a release date, you will only end up disappointed

>> No.58608501

>>58608432
Nope. Just want to play recent games at 1440p. Also thinking of buying a 6700K, 32GB DDR4 2133MHz, and a Z170-Pro Gaming. Maybe, again, overkill right?

>> No.58608609

32GB is total overkill. 8GB is enough 99% of the time, so you could go 16GB at a stretch and almost never expect to need more (for gaming).

CPUs play a role, but all Intel i7s from the last 7 years are close to each other. There's little difference between a 6700K and a 4770K, for example.

For 1440p I'd say you'd need something strong but not THE strongest. It depends on the budget of course. But don't go lower than midrange.

>> No.58608640

Rumor has it that it's been delayed 4 months. During testing it burned down the whole factory and only 1.7% survived, but they made some killer pork shoulders with wood screws.

>> No.58608655

Better to get faster RAM at a lower capacity than more GB of slower RAM. Most games will never need more than 8GB, or at a stretch 16GB (if ever). That means it's better to put the saved money toward faster RAM, not more RAM.
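The tradeoff above can be put in rough numbers. A back-of-envelope sketch (assuming dual-channel DDR4 with a 64-bit channel per stick, peak theoretical bandwidth only, ignoring latency and timings):

```python
def ddr4_bandwidth_gbs(transfer_rate_mtps: int, channels: int = 2) -> float:
    """Peak theoretical bandwidth in GB/s: each transfer moves 8 bytes per 64-bit channel."""
    return transfer_rate_mtps * 8 * channels / 1000

# Going from DDR4-2133 to DDR4-3200 in dual channel:
slow = ddr4_bandwidth_gbs(2133)  # ~34.1 GB/s
fast = ddr4_bandwidth_gbs(3200)  # 51.2 GB/s
```

So the jump from 2133 to 3200 is roughly a 50% bandwidth increase, which is the kind of gain the saved money buys, whereas unused extra capacity buys nothing.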

>> No.58608848

>>58608655
>>58608609
Thank you for the tips.

>> No.58608916

>>58608655
>>58608609
By the way, is it just me or do Intel CPUs only support 2133 MHz at maximum?

>> No.58608943

>>58608916

If I recall correctly only the overclocking-supported motherboards go 2400 or more but I can't be sure of exceptions.

But it's not a big deal, some overclocking motherboards are cheap enough.

>> No.58608973

>>58608916

they support 2400 but there are motherboards that can overclock past 3000

>> No.58609037

>>58608973
>>58608943
It checks out. Thank you

>> No.58609558

>amd is shit
>blame nvidia

>> No.58609729

I'll buy whatever second-best AYYYYYmd card comes out in May, because I just want FreeSync and not to be fleeced out of my money.

>> No.58610022

>>58608145
Volta is supposed to be a dramatic architectural bump, while Pascal (excluding GP100) is basically a Maxwell retread.

If you have $1k+ or whatever to blow on a 1080 Ti, go for it, but Pascal will more than likely get the Kepler treatment in late 2017 or early 2018.

>> No.58610269

>>58608655
>most games will never need more than 8GB or at a stretch 16GB

When I built my pc recently everyone said 8gb would be enough, but it really wasn't. I experienced a lot of stuttering in games like Watch Dogs 2 and Just Cause 3 even when the framerate was high. I recently got 2 more sticks to get it up to 16gb and that eliminated all of the problems immediately. I'd say 16gb is the new minimum nowadays, unless it's a sub $500 build

>> No.58610983

>>58610269
i just decided to crack open the old-ass machine i have in the corner to see what new parts it would need to get it up to par for gaming. The motherboard is currently a MCP61PM-GM (http://www.pc-specs.com/mobo/ECS/ECS_MCP61PM-GM/1456) that supports a max of 4 GB DDR2. According to your comment, that's basically gg, throw the board out and start over, correct?

>> No.58611722

>>58610983
A lot of games run fine on 4GB.

>> No.58611782

>>58608392
I have yet to see anyone provide actual evidence that Nvidia gimps old cards with driver updates.

>> No.58611791

I am waiting for it, but mostly for it to drive down prices of the 1070

Got better ways to spend money, like on stocks.

>> No.58611839

>>58611782
It's less gimps and more Nvidia cutting off support right as they launch new cards. Meanwhile AMD provides support much longer which allows older cards to match and sometimes overpower the nvidia card rival of that gen.

>> No.58611868

>>58611839
My 9800GT still receives the latest drivers.

Try using an AMD HD5000 on Windows 10.

If by "support" you mean improvements, sure. But your assertions about AMD are complete shit.

>> No.58611883

>>58608145
if the RX 480x or whatever its called is shit I'll probably go with the 1070.

>> No.58611884

>>58611868
I have a 5770 running W10 just fine anon.

>> No.58611900

>>58611884
My mate's card refuses to work under 10, he had to go buy a new one.

Again, my 9800GT hasn't skipped a beat.

Regardless, I'm still waiting for proof of gimping.

>> No.58611908

>>58608145
No but wait for Vega

>> No.58611910

>>58611900
I believe I'm using 8.1 drivers.

>> No.58611916

>>58611868
>But your assertions about AMD are complete shit.

I'm not saying Nvidia stops doing drivers, but really? It's been shown a few times at the very least that AMD cards close the gap on Nvidia cards. The 480 and 1060 are within like 2% of each other now. What the fuck happened to that 10% lead Nvidia bragged about at launch?

>> No.58611949

>tfw my 980 Ti hasn't been memed into irrelevancy as fast as I feared

I'm a VRfag, so if the new Ti is a proper screamer of a card, I'll strongly consider buying it.

>> No.58611950

>>58611916
>sometimes overpower the nvidia card rival of that gen.

Name a single case of that happening.

>What the fuck happened to that 10% lead Nvidia bragged about at launch?

AMD got their shit together over time? I'd prefer the card that works optimally the day I buy it.

>> No.58612115

>>58611950
>Name a single case of that happening.

I'll admit to being a fuck-up and exaggerating; the gaps on cards do close, the 1060/480 being a recent example. I wouldn't even be bringing this up as much if the card wasn't still $50 more expensive in the States. Even if it kept the lead, the premium was sketchy.

>> No.58612256

>>58611950
>>58612115
Memes aside, you can't argue that Nvidia destroys AMD with how they do driver updates. Nvidia cards get them practically once a month.

And they don't *usually* break them. When the Thunderbolt compatibility driver update came out, I could eject my 980 Ti like a USB drive from my system tray.

>> No.58612291

How good will it be for deep learning? Between a 1080 and a Titan, sure, but what about cost/perf?

>> No.58613202

>>58611839
Nvidia doesn't cut off driver support, period. On top of that, you can still download releases without their stupid Experience app too.

>> No.58614568

Just buy a Titan X you stupid fuck

>> No.58614591

>>58608309

AMD does not have the same access to fabs. They're bound by the wafer supply agreement they have with GloFo, which forces them to pay extra for wafers not fabbed on GloFo's (currently inferior) process.

>> No.58615513

>>58614568
That's for rendering and AutoCAD projects; the 1080 is better for gaming.

>> No.58615983

>>58614568
The irony: call someone a stupid fuck and then tell them to buy a Titan X to play games on.

>> No.58616224

>>58611782

List of old Nvidia GPUs I used which never showed signs of driver gimping:
* Diamond Edge 3D
* Riva TNT2
* GeForce 3 Ti 200
* GeForce FX 5500
* GeForce 6600 GT
* GeForce 8800 GTS
* GTX 470
* GTX 760 (current GPU)

List of old Nvidia GPUs I used which were gimped after a driver update:
* None

>> No.58616617

>>58615983
newsflash: you can play games on it

>> No.58616630

>>58616224
>Diamond Edge 3D
surprised you gave them a second chance

>* GeForce 3 Ti 200
>* GeForce FX 5500
what a sidegrade

>> No.58616654

>>58616630

Different PCs anon

>> No.58616716

>>58614591

There are definitely differences in their deals with the foundries, of course, but the main reality remains that neither NVIDIA nor AMD has its own foundries, and that drives their prices up.

It's not a coincidence that every piece of technology NVIDIA releases that's faster than AMD's sits on a bigger die and is aggressively priced higher to the consumer.

That reality becomes obvious when you look at Intel: a company with its own foundries can destroy both AMD and NVIDIA on price.

>> No.58616730

>>58616716
Intel literally broke the law repeatedly, giving kickbacks to companies to buy Intel and bribing companies to release compilers that don't support AMD. They did it for like ten years.

>>58616224
Wasn't it literally 470s that burnt out from a driver update? The 970's 3.5GB is also gimped without any drivers needed. They also fuck over laptops; I can't use HDMI audio because their DRM doesn't recognize it.

>> No.58616738

If you like the name, it's your call whether to buy it or not.

>> No.58616750

>>58616730

>Wasn't it literally 470s that burnt out from a driver update?

Not mine at least.

>The 970 3.5GB is also gimped without any drivers needed
Yes, that was a dick move to release it like that. But it was like that from day one, not only after a driver update.

>> No.58616761

>>58616730
> compilers that don't support AMD

That's utter bullshit. We have benchmarks compiled with GCC, not even running on Windows, that show Intel being far faster than AMD per thread for the last ~10 years.

Only a delusional AMD user with purchase regrets and cognitive dissonance at max would even dare to claim AMD CPUs were amazing lately, especially per thread.

Sorry, but you were a moron; you fell for the AMD CPU memes. GPUs are still best from AMD, but that's mainly because NVIDIA is also limited; both have no foundries.

>> No.58616773

>>58616224

use a gtx today with the latest legacy driver and then try drivers that are 2-3 years old, you get nearly double the fps in most games

the same was true for my GTX 580: using the 358.00 drivers now instead of the new ones got at least 30% more performance

>> No.58616783

>>58616773

ah shit, typo, meant ''use a gtx 460 today with the latest legacy driver and then try drivers that are 2-3 years old, you get nearly double the fps in most games''

>> No.58616809

>>58616761
The world is older than 10 years anon

http://techreport.com/news/8547/does-intel-compiler-cripple-amd-performance

and it's still happening:
https://www.techpowerup.com/forums/threads/intel-compiler-patcher-boosts-amd-processors-performance.217814/

Companies don't use GCC for their software, they use Intel's compiler, because they were paid to in the past and it has since become part of their toolchain.
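The mechanism the linked articles describe can be sketched as a toy model. This is NOT Intel's actual dispatcher code, just an illustration of what vendor-string dispatch means; the function name is made up:

```python
# A CPU reports a 12-byte vendor string via CPUID leaf 0:
# "GenuineIntel" for Intel, "AuthenticAMD" for AMD.

def pick_code_path(vendor_string: str, has_avx: bool) -> str:
    # The controversial behavior: the fast path is gated on the vendor
    # string rather than on the feature flags the CPU actually reports.
    if vendor_string == "GenuineIntel" and has_avx:
        return "avx"
    return "baseline"  # non-Intel CPUs land here even when has_avx is True
```

The "compiler patcher" tools mentioned above work by removing exactly this vendor check from compiled binaries, so feature detection alone decides the path.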

>> No.58616833

>>58616773

I tried that GTX 760 with both 340.x and 375.26.x (now) drivers and I see no perceived difference in performance. If their goal was to push me to a newer card they failed miserably.

>> No.58616838

>>58616809

But it's still a moronic statement, only coming from spergs with regrets over their AMD purchase, because we know for a fact that several benchmarks compiled with GCC on UNIX show similar results.

The short story is that even if Intel favored Microsoft+Intel in some cases, a) it was a minuscule advantage when it happened, and b) it didn't happen in the overwhelming majority of cases because of GCC and others.

As said, sorry, but you were a moron; you fell for a meme. Excuses won't save you. Intel is not "evil", they just have their own foundries, which cost a gargantuan amount of money to build.

>> No.58616867

>>58611839

No, it's gimping alright; check Fallout 4 benchmarks. The 780 Ti went from good to piss-poor after one Nvidia update that introduced useless shitworks functions.

>> No.58617249

>>58616617
I didn't think you would get the point.
Go back to /v/

>> No.58617277

>>58616617

Barely. Get two in SLI, you pleb.

>> No.58617771

>>58617277
Waste of money to get 10% more fps.

>> No.58617886

>>58617771

Sorry, I omitted the sarcasm flag

>> No.58618074
File: 31 KB, 513x399, 584848.jpg

>>58617886
Joke's on you. I doubled your sarcasm.

>> No.58618096

>>58608655
>Better get faster RAM
you're fucking retarded

>> No.58618111

>>58616716
>Nvidia uses a bigger die

This is wrong though. It hasn't been true since the HD 7000 series. The 7950/70, 290/X, and Fury chips were all much larger than their competitors. You could argue the Fiji die itself was around the same effective size as the one used in the 980 Ti/Titan X, but the interposer and HBM swell the size of the Fury's chip in a way, as the interposer itself is silicon and can lead to a trashed part if the mounting process doesn't go perfectly.

It's only with Polaris that AMD has shrunk their die area back down, but Vega is going to be a massive fucking die with all the same interposer and HBM trouble the Fury had.

I'm hoping Vega is as big a hit in the GPGPU market as AMD is building it to be, so we can get a more traditional gaming-centric arch in the near future. They just need that little bit of extra capital and manpower to reasonably support a 3rd arch.

>> No.58618992

>>58616224
Bullshit, my brother has a 760 and my 750 Ti (Maxwell) outperforms his in every game now. You are delusional.

>> No.58619272

>>58618992
(X) Doubt

>> No.58619473

>>58618111
Hawaii and Tahiti were both around the size of their competition, which were GK110 and GK104 respectively.

>> No.58619523

>>58619473
GK110 was fucking enormous, there's no way Tahiti was close to its size.

>> No.58619532
File: 51 KB, 522x683, 1477775605933.png

>>58611782

>> No.58619546

>>58611782
If you cover your ears and pray to God the problem will go away.

>> No.58619770

>>58616730

>bribing companies to release compilers that don't support AMD. They did it for like ten years.

Wasn't it on a Cyrix CPU that people found it would cut processing time almost in half with certain software just by faking a GenuineIntel identifier?

>> No.58619779

>>58619770
That never happened.

>> No.58619906

>>58619779

It did, I just don't remember exactly which CPU it was.
IIRC it was some random dude who masked the CPU identifier in Windows for shits and giggles, and found the performance difference by chance.

>> No.58619927

>>58619906
Nobody does that 'by chance', it's obviously fabricated

>> No.58619999

>>58612256
>Practically once a month

AMD does also anon, sometimes more than once.

>> No.58620206
File: 7 KB, 318x203, 1455716155408.png

>>58612256

>> No.58620910

>>58619532
>amd drivers are shit at launch and need 2 years to mature
>nvidia drivers are good at launch and can't get much better over time

still not gimping

>> No.58621184

>>58620910
It's also that they don't bother to optimize new games on old hardware, whereas AMD has stuck with one architecture so long that 4+ year old cards largely benefit from optimizations meant mostly for newer cards.

TeraScale optimization efforts ended a while ago, but GCN support will have stuck around much longer than Fermi/Kepler thanks to Polaris probably being sold into 2018.

>> No.58621195

>>58621184
all true
still not nvidia gimping their older cards

>> No.58621272

FUCKING POOR FAGS

Titan XP or bust

>> No.58621615

Volta is coming at the end of 2017, 12nm based chips

>> No.58621683

>>58608432
Is it niche to want to run newer games at over 100 Hz across three 1440p monitors?

I see this as the only leap to a better experience from my current setup.
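For scale, here is a rough pixel-throughput comparison for that setup (raw pixels per second only; actual per-pixel cost varies wildly by game and settings):

```python
def pixel_rate(width: int, height: int, hz: int, monitors: int = 1) -> int:
    """Raw pixels per second a GPU must fill for a given display setup."""
    return width * height * hz * monitors

# Three 1440p monitors at 100 Hz vs one 1440p monitor at 60 Hz:
triple_1440p_100 = pixel_rate(2560, 1440, 100, monitors=3)  # 1,105,920,000 px/s
single_1440p_60  = pixel_rate(2560, 1440, 60)               #   221,184,000 px/s
# The triple-monitor target needs exactly 5x the raw fill rate.
```

That 5x factor is why this really is a niche that only the very top cards (or SLI) can plausibly serve.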

>> No.58621695

>>58621195
Agreed, Nvidia doesn't gimp old cards, they just misrepresent what their products are to the point of borderline fraud.

They portray their products as being top-of-the-line hardware, but in reality you're primarily leasing the attention of their enormous driver optimization and dev consultation team for a year or two, and the raw hardware is arguably underpowered in significant measures compared to AMD.

Nvidia is great if you plan to upgrade every 1-2 years (and can successfully pawn off old cards on eBay before people realize they're now lemons), but you'd be an idiot to believe they're good long-term purchases.

>> No.58621712

>>58621683
[email protected]/144Hz is coming this year, anon.

>> No.58621784

>>58621695
Nvidia cards already deliver close to 100% of their performance capabilities at release.

AMD needs time.

That's where the meme about AMD cards aging like fine wine comes from.
I would be pissed to have a potent card whose drivers are so shit that it takes a year or so before it translates into more performance in games.

>> No.58621882

>>58621784
DX12/Vulkan will shake up the old regime, since it forces game engine devs to put all optimizations in the game (and hence available to all users) rather than having Nvidia/AMD bundle rewritten shaders for every AAA game ever made inside the drivers.

It speaks well for Nvidia's driver team that their DX11 speeds usually match DX12 performance, but it's also a sign that they're going to lose their edge of premier DX11 performance in the long run.

>> No.58622211

>>58621882
>vulkan
DOA
>dx12
it's going the way of dx10

>> No.58622614

>>58622211
> all this damage control

Both new API models are really for the main engine vendors, of which there are relatively few, all of whom are already doing Vulkan and/or DX12 versions.

Nvidia will give a lot more support in the future, assuming they finally have a non-garbage implementation of async compute in Volta.

All the different approaches basically lend themselves to the same thing: overlapping post-processing on one frame with the geometry/shadow-map rendering for the next, which you can do pretty easily in Vulkan/DX12 or more painfully (but still possible) with DX11 shader rewrites.
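The overlap described above can be sketched as a toy timeline, purely illustrative (no real graphics API here; the stage names and slot model are made up for the example):

```python
def frame_timeline(n_frames: int, overlap: bool):
    """Build a list of time slots; each slot holds the work running concurrently."""
    slots = []
    if not overlap:
        # Serial pipeline: geometry, then post-processing, one after the other.
        for f in range(n_frames):
            slots.append([f"geom[{f}]"])
            slots.append([f"post[{f}]"])
    else:
        # Async-compute style: post-processing of frame f shares a slot with
        # the geometry/shadow-map work of frame f+1 on a second queue.
        for f in range(n_frames):
            slot = [f"geom[{f}]"]
            if f > 0:
                slot.append(f"post[{f - 1}]")
            slots.append(slot)
        slots.append([f"post[{n_frames - 1}]"])
    return slots

# 4 frames take 8 serial slots but only 5 overlapped slots.
```

If the two stages stress different hardware units (shaders vs. ROPs/geometry), the overlapped slots cost little more than the serial ones, which is where the async-compute win comes from.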

>> No.58623354

>>58622211
>this API exposes how shit my company's products are, better call it DOA since i dont know anything about it.

>> No.58623530

>>58623354
>3 games
sorry my man, it is DOA
I know Vulkan is great (for both Nvidia and AMD cards btw), but it's the truth

>> No.58623860

Nvidia won't drop the 1080Ti until Vega is out.

>> No.58624127

>>58623530
>muh vulkan is only for gaymes

>> No.58624225

>>58621615
that's the professional card

your gaymer cards won't be out till 2018 and will be followed by a refreshed Vega.

>Nvidia is great if you plan to upgrade every 1-2 years (and can successfully pawn off old cards on eBay before people realize they're now lemons)

So much this. I've bought several AMD/ATI cards that are even better than when I bought them, and all of them still work.

I have had two Kepler cards and a Maxwell card fry in my system. Also, the Kepler cards went completely to shit after the Maxwell 900 series released. Literally all the games released post-Maxwell had straight up criminal levels of non-optimized drivers.

Hell, even a $700 GTX 780 Ti is getting fucking stomped by the R9 290X in modern titles. That alone should tell you all you need to know about the longevity of Nvidia.

Already the performance of Polaris is above what I expected. It kicks the shit out of the 1060 at 1440p and it even seems to run some Nvidia-optimized games better. With an 8GB 480 that has a strong overclock you can actually run The Witcher 3 with HairWorks on (high settings 4xAA or ultra settings 2xAA) and a tessellation limiter set to x16, and keep high 50s up to a 60fps lock. The 1060 only gets a 60fps lock when HairWorks is completely off. Even a damn 1070 cannot maintain a perfect 60fps lock with HairWorks on max settings.

It's ironic and enraging that all the "reviewers" all of a sudden run benchmarks with HairWorks disabled by default. These same shady fucks already used HairWorks to gimp AMD in the past, and all benchmarks had HW on, but now that Polaris actually handles HairWorks better than its equivalent Nvidia card, they keep that setting turned off for benchmarking because it makes Nvidia look extremely bad.

>> No.58624336

>>58624127
it literally is

>> No.58625079

>>58621615
>Volta is coming at the end of 2017, 12nm based chips

protip: TSMC 12nm = 16nm v2 + marketing

>> No.58625457

>>58624336
It also supports Android on phones as well. It's a completely open standard. If you have the time and skill you can bring it to any platform.

>> No.58626384

>>58625457
again, I know vulkan is great
that makes no difference though

>> No.58626684

should i grab a 1080 now or wait for this shit?

>> No.58626723

>want to play games on max settings
>decide to get 1080
>this is coming out soon
>dunno if i should wait

>> No.58626726

>>58624225
My 1070 is doing 66-70fps in the White Orchard exterior, ultra with 8x HairWorks AA, bro. Witcher seems poorly optimized to me; I don't see why people even use it as a standard bench.

>> No.58626795

>>58626684
The 1080 Ti will have at minimum 1000 extra shaders over a 1080; Vega's quality will set its price at $600, $700, or $800, so no one can answer till then.

>> No.58626943

>>58625079
This. It's literally fucking nothing. The clock gains won't be like the jump from 28nmP+ to 16nmFF+, and that's literally the majority of the gains made by Pascal. The arch improvements were negligible. Unless Volta is a fundamental top-down redesign (which it isn't), you won't see much other than hardware-level async compute, which will only improve VR latency. Volta will literally be marketed as the next-gen VR-ready graphics card, and all of the benchmarks showing massive jumps in performance will have *for VR only* on them.

>> No.58626951

>>58608145
Depends
1080ti is poor man's Titan XP

>> No.58626952

>>58626726
>White orchard
nigger come back to me with numbers from Novgorod

>> No.58626962

>>58626795
Technically the Titan has two SMs disabled for yields. So now that the process has matured, they could release a full die with more shaders than the Titan if they wanted to...

>> No.58626996

>>58608145

8gb video cards are already overkill.

>> No.58627122

>>58616773
show me the fucking data.

show it to me.

>> No.58627127

>>58618992
you too. show me your data.

>> No.58627148

>>58626996
Nope. A lot of games use over 4GB, so the spec has to be either 6GB or 8GB, and they determine that literally by the layout. They have a standard module with a certain capacity, and they put as many of them and their traces on the PCB as is required for the set bandwidth. So a 192-bit 1060 gets 6GB, a 256-bit card gets 8GB, and so on.

If a game draws even just 4.05GB of memory, that alone justifies 6/8GB cards until the draw gets close to 8GB and you need more.

HBM was a major advancement because it redesigned the memory layout to increase bandwidth without having to make unnecessary traces or stack unnecessary VRAM. You get a 1024-bit I/O per stack and can stack up to 8 dies high, so you can tailor the bus width to the memory capacity in a more fine-grained way than with normal GDDR.
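The bus-width-determines-capacity rule above can be sketched numerically. A back-of-envelope model assuming 32-bit GDDR5 chips of 1GB each at 8 GT/s (typical for Polaris/Pascal-era cards; real products vary in chip density and data rate):

```python
def gddr_config(bus_width_bits: int, chip_io_bits: int = 32,
                chip_capacity_gb: int = 1, data_rate_gtps: float = 8.0):
    """Chip count, total capacity, and peak bandwidth implied by a GDDR bus width."""
    chips = bus_width_bits // chip_io_bits          # one chip per 32-bit slice of the bus
    capacity_gb = chips * chip_capacity_gb          # capacity follows chip count
    bandwidth_gbs = bus_width_bits // 8 * data_rate_gtps  # bytes/transfer * transfers/s
    return chips, capacity_gb, bandwidth_gbs

# 192-bit bus -> 6 chips -> 6GB (like a 1060); 256-bit -> 8 chips -> 8GB (like a 480):
print(gddr_config(192))  # (6, 6, 192.0)
print(gddr_config(256))  # (8, 8, 256.0)
```

This is why 6GB and 8GB fall directly out of the 192-bit and 256-bit layouts: the capacity is a side effect of hitting the bandwidth target, not a number chosen independently.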

>> No.58627238

>>58621712
Nothing would be able to run it though. Probably not until 2018

>> No.58627288
File: 116 KB, 500x820, memory more like MEMEry.png

>>58627148
A game sucking down as much VRAM as it can doesn't mean it actually needs it.

>> No.58627433

>>58627288
is the Fury X the best available?

>> No.58627485

>>58627288
tell me that when you have an 8gb FuryX

>> No.58627765
File: 50 KB, 524x700, max comfy.jpg

>>58608501
I play Witcher 3 maxed at 1440p at 60+ fps. I have a 1070 and a 6600K, so a 1080 by itself would be more than enough for your needs.

>> No.58629110

I run two 1080s in SLI on a 1440p 165 Hz IPS monitor

165 min fps in most games, feels good man

>> No.58629122

>>58626996
You can never have enough vram

>> No.58629191

>>58608145
Wait for RX 490 instead.

>> No.58630296

>>58621615
>WCCFTECH

Man I remember the 25nm twelve core Phenom IV too my boyos

http://web.archive.org/web/20150413165626/http://wccftech.com/rumor-amd-phenom-iv-x12-170-baeca-25nm-cpu-leaked-features-12-cores-6-ghz-core-clock-am4-socket-compatbility/

>> No.58630424

>>58608145
>Is it worth waiting for the GTX 1080 TI?
>Is it overkill? Will it break down before it starts going outdated?

>>>/v/
Real tech enthusiasts don't play videogames

>> No.58630566
File: 9 KB, 274x290, 1484428244585.png

>>58619927
>>58619779

>> No.58631206

>>58619532
It's just that AMD starts with really shitty drivers

>> No.58631221

>>58616783
>double the fps

fucking end yourself

>> No.58631271

>>58626951
That makes literally no sense at all. The "ti" GPUs are usually better than the "Titan" of the same generation.

>> No.58631276

I'm waiting for the 1170 myself.

>> No.58631307
File: 216 KB, 700x715, 13a461ae9fbdf649d85b24f94311bdd2.jpg

>>58608145
>Is it worth waiting for the GTX 1080 TI
No

>Is it overkill?
Yes

>Will it break down before it starts going outdated?
Yes

>> No.58631325

>>58631307
How is it overkill for 5k resolution?

>> No.58631345

>>58631276
I'm waiting for the 2670 myself.

>> No.58631352

>>58621272
>Rich enough for $999 gpu
>On 4chan

>> No.58631359

>>58631345
The reason I'm waiting for that one is because I have a 770 still working fine today, and I'd like my replacement to do well with VR in the future.

>> No.58631361

>>58631276
which will just be a rebranded 1080

>> No.58631366

>>58631361
Are you telling me to wait another iteration?

>> No.58631384

>>58631366
there are no good games to play with high end hardware anyway. Only reason to spend money on a GPU is to play the same old games in 4k.

>> No.58632606

>>58610983
Well, if you wanna run newer games it won't be sufficient, but there will still be plenty of games you could play. Tbh senpai, if it runs DDR2 RAM I doubt anything in there is usable for anything other than low-resource games, shitposting, and media consumption.

>> No.58633087

>>58608145
No
Get TITAN XP

>> No.58633506

>>58608501
>32 GB ram
>2133 mhz

Get 16 gigs of way faster ram. You'll never use 32, but faster ram can actually be a benefit.
