
Due to resource constraints, /g/ and /tg/ will no longer be archived or available. Other archivers continue to archive these boards.

/g/ - Technology


File: 262 KB, 1278x1500, 81uNEDESDYL._SL1500_.jpg
No.68441098

I have a friend with an unspecified "AMD FX" PC which he claims to have built in early 2017.
He has a GTX 1070.
Now, what gets me is that, despite the fact that he swears he isn't overclocking, he gets insane framerates in video games. For example, at 1080p in fortnite at max settings, he gets about 180fps average, 220 max, and about 120 minimum. Mind you, all this with 16GB DDR3.
Meanwhile, my Ryzen 7 2700X and 1080 Ti get 220 max, about 150 average, and about 90 minimum at the same resolution with 3200MHz DDR4.
Now, my framerates in all games are consistent with other people's ryzen benchmarks, so I'm not concerned, but what the HELL is up with his Bulldozer?
Is he lying about his setup, or is the FX chip actually good?

>> No.68441146
File: 228 KB, 1278x721, poo fx.jpg

most likely lying

>> No.68441155
File: 17 KB, 836x768, UPGRADE2010.png

FPS in games is mostly driver related
when you buy a videocard (You) aren't buying a piece of hardware, (You) are buying a subscription to driver updates - the videocard is just a protection key

>> No.68441177

my FX6300 is based

>> No.68441194

It's absolute garbage and not worth it. Ryzen slaughters it in every way. Save the extra few bucks up for a 2600.

>> No.68441201

FX cpus were a flop on release, pretty outdated now

>> No.68441287

They're good now for super budget servers but that's all. Ryzen surpasses it by leaps and bounds.

>> No.68441305


>> No.68441336

Zero chance. Any FX chip would be bottlenecking that 1070 pretty badly; hell, the FX-8350 was a pretty big bottleneck on the 970.

>> No.68441337
File: 120 KB, 1024x766, 03_ARM13.jpg

There wasn't "much" wrong with the chip design itself. It just came out at the wrong time.
Zen is pretty much identical to Bulldozer when scaled over the years with the natural performance growth, except that multi-core is much more useful nowadays.

I mean, it was a decent *budget* chip around its release, but no, it wasn't good.

>> No.68441354

>They're good now for super budget servers but thats all.
No. They are highly inefficient; the power bill of an FX chip running 24/7 will eat up any price difference against a Ryzen 3 very quickly.

>> No.68441393

There is plenty wrong with the design of Bulldozer,
and it is a vastly different arch than Ryzen.
What the fuck are you smoking.
If by "came out at the wrong time" you mean it should have released in 2006-08, then
yeah, it would have been good, if marketed correctly as a quad core with "hyperthreading"

>> No.68441603

bulldozer got better over time as more games became multithreaded, but it's still nowhere near Ryzen or even Sandy Bridge CPUs.

>> No.68442638
File: 171 KB, 930x797, One Core per CU.jpg

Of course it's good if you've still got an 8350; I wouldn't recommend getting one now, but if you have an 8350 you can still play games.

>> No.68443105

this thread is full of market makers

>> No.68443244

it's more down to GPU than CPU 90% of the time, which means he is lying or runs at subpar settings. there is no way a 1070 is getting more FPS than a 1080 Ti unless the 1080 Ti is on a dual core Pentium

>> No.68443251


>> No.68443286

>Zen is pretty much identical to Bulldozer
Stop reading here.

>> No.68443364

Hence the quotation marks.
No, it's not vastly different; vastly different would be an entirely different architecture.
Bulldozer put two integer cores with a shared FPU in each module; Ryzen presents each single core as two hyperthreads.

That's your problem, you didn't read it all and didn't understand what it was about.

>> No.68443378

>Of course it's good if you've still got an 8350
this is such bullshit. the FX-6350 was way better, since it could actually get decent single core performance with overclocks compared to the 8350, and the additional two cores of the 8350 didn't do shit for games when they were running at worse single core performance overall

>> No.68443469

the 6350s were binned worse and clocked worse, what are you on about?
all FX hit a wall around 4.5GHz besides the 9xxx housefire editions

>> No.68443482

FX hits an air-cooled voltage wall at 1.38v; for mine, the temps jump up dramatically just to get that extra 100MHz to 4.5 because it needs 1.44v.

>> No.68443584
File: 6 KB, 250x250, oldman.jpg


meh, so much gaslighting in this thread

fx has 2MB of L2 cache per module (shared between its two cores)
ryzen has more L3 but only 512KB of L2 cache per core

maybe that bottlenecks the max fps, not a big deal to cause any drama, both cpus are workhorses
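If you want to sanity-check the cache numbers on your own box instead of taking the thread's word for it, glibc on Linux exposes them through sysconf. A minimal sketch, assuming a Linux build of Python (other platforms just come back empty rather than erroring):

```python
import os

def cache_sizes():
    """Read per-level data cache sizes (bytes) via POSIX sysconf on Linux.

    Returns something like {"L1d": 32768, "L2": 524288, "L3": 8388608};
    levels the libc doesn't report are simply omitted.
    """
    names = {
        "L1d": "SC_LEVEL1_DCACHE_SIZE",
        "L2": "SC_LEVEL2_CACHE_SIZE",
        "L3": "SC_LEVEL3_CACHE_SIZE",
    }
    sizes = {}
    for level, name in names.items():
        try:
            size = os.sysconf(name)
        except (ValueError, OSError):
            continue  # name not exposed on this platform
        if size > 0:
            sizes[level] = size
    return sizes

if __name__ == "__main__":
    for level, size in cache_sizes().items():
        print(f"{level}: {size // 1024} KB")
```

Note this reports what the kernel sees for the core the query runs on; whether an L2 is per core or shared per module (as on FX) is topology that sysconf alone doesn't tell you.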

>> No.68443599

My 8320 @ 4.4GHz runs most modern games at largely GPU bottleneck with a 980Ti, but it's severely lacking in single thread vs my 1700X.

>> No.68443609

>Zen is pretty much identical to Bulldozer
Coffee Lake is pretty much identical to NetBurst

>> No.68443624

>fx series
>good for servers
$15 x5650s are leagues better

>> No.68443638

Finding good motherboards for FX is easier, and usually cheaper.

>> No.68443643
File: 74 KB, 504x446, Bulldozer%252032nm_thumb%255B1%255D.jpg

Bulldozer as an architecture was designed as a server CPU from the beginning and then brought to the desktop; 16MB of total CPU cache on the FX-8000 series was no joke for a CPU that cost less than an i5.

The X5650 is better, but back in 2011 that platform was really expensive, and X58 boards are really expensive now too.

>> No.68443651
File: 295 KB, 380x350, 1533762184657.png

I had an 8350 running at 4.7ghz on air and it killed absolutely anything I threw at it.

I "upgraded" to a 2700X and can barely tell any difference.

>> No.68443662

You can't compare the cache implementations because of the immense differences in SMT implementation and basic architectures.

>> No.68443669


that's strange, because I get about 150 min, 180 avg and 220 max on my Ryzen 2700X + GTX 1070, but I also run a lot of ACTIVE shit in the background

check if your RAM's XMP profile is enabled, and also make sure your board and chipset support XFR 2.0 and turn it on

make sure your cooler is adequate and is properly transferring heat

as for the Bulldozer performance in modern games, I guess that has to do with processor extensions added to the Bulldozer-era CPUs finally being utilized by modern programs, squeezing out a bit more performance than games could at the time

>> No.68443687

You can clearly tell the difference in games like WoW.
On the other hand, my 8320 ran DOOM 4 at a GPU bottleneck while running at 1.4GHz. I fucked up some OC settings and didn't notice until I fired up ESO and had 14fps.

>> No.68443764

>I had an 8350 running at 4.7ghz
Shit dude your cooler must be the size of a football because my Hyper T4 can do 4.5 but that's right at the edge.

>> No.68443782

The FX chips were insane but no way they're outperforming a 2700x

>> No.68443788

NH-D14/D15 compete with AIO watercooling.
CM's Hyper line is pretty shit compared to high end air coolers.
I run my 8320 4.4GHz 1.37v on an NH-D9L.

>> No.68443809

Not to mention that the Zen 1 L3 cache itself has double the bandwidth and lower latency compared to the previous arch, plus a huge improvement in L1 and L2 latency and bandwidth.

>> No.68443831

>or is the FX chip actually good?

oh ha ha ha ha ha oh wow, ha holy shit no

>> No.68443843

That, combined with the far more advanced prefetching, makes the cache one of Zen's most advanced features. I still can't quite comprehend the fact that they built a mini neural network into the prediction hardware.
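For anyone wondering what a "mini neural network" in a CPU even means: AMD's marketing line is usually read as describing a hashed-perceptron style branch predictor. Below is only a toy, single-branch sketch of the perceptron idea; the real hardware's hashing, table sizes and per-branch indexing aren't public, so none of that is modeled:

```python
class PerceptronPredictor:
    """Toy perceptron branch predictor: one weight per global-history bit.

    Predict taken iff bias + sum(w_i * h_i) >= 0, where h_i is +1 (last
    outcome taken) or -1 (not taken). Train on mispredicts or weak outputs.
    """

    def __init__(self, history_bits=8, threshold=16):
        self.history = [-1] * history_bits   # start as all "not taken"
        self.weights = [0] * history_bits
        self.bias = 0
        self.threshold = threshold           # stop training once margin is big

    def predict(self):
        y = self.bias + sum(w * h for w, h in zip(self.weights, self.history))
        return y >= 0, y

    def update(self, taken):
        pred, y = self.predict()
        t = 1 if taken else -1
        if pred != taken or abs(y) <= self.threshold:
            self.bias += t
            self.weights = [w + t * h for w, h in zip(self.weights, self.history)]
        self.history = self.history[1:] + [t]  # shift in the newest outcome

# A strictly alternating branch (T, N, T, N, ...) defeats a simple
# saturating counter, but is linearly separable given history bits,
# so the perceptron learns it quickly:
p = PerceptronPredictor()
hits = 0
for i in range(1000):
    taken = (i % 2 == 0)
    guess, _ = p.predict()
    hits += (guess == taken)
    p.update(taken)
print(f"accuracy: {hits / 1000:.2f}")
```

The point of the perceptron scheme is exactly this: it can pick up correlations with any individual history bit, where fixed-length pattern tables blow up in size.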

>> No.68443864

>He doesn't want the multi core performance of an i7 3770 for cheap.

>> No.68443868

The FX chips are better now than at release. New CPUs are just flaming garbage.

>> No.68443879

kys, /v/ermin. back to the hole you crawled out of.

>> No.68443882
File: 98 KB, 640x640, Intel-Pentium-Processor-G4600.jpg

How dare you

>> No.68443883
File: 377 KB, 561x507, 1541815305776.png


fx are comfy as hell. always delivering just the needed amount of power to keep things smooth, even after 5 years, thanks to the hardware optimizations.

>> No.68443895
File: 3.17 MB, 4032x3024, 20181110_222503.jpg

Fuck amd

>> No.68443897

R5 2400G is actually pretty slick, and it's unfortunate that RAM prices are as bad as they are.

>> No.68443911
File: 98 KB, 781x1177, 1503800146174.jpg

I would if I could.

>> No.68443914
File: 172 KB, 826x818, amd_fx8350_8_core_cpu_1504700744_6b25d1e40.jpg

He's talking about the botnets, high-TDP Intel chips, non-soldered heat spreaders.

FX is just a pure lineup of unlocked processors with no Jewish tricks.

>4 phase mobo without heatsink
>FX8300 in the socket
FX is pretty good, fuck that board.

It's nice of them to allow it, but 12w chips do not belong on those super low end 760G / 4 phase 970 boards.

>> No.68443920

That's a 6300, which is fine, but I wouldn't try to OC it very far in that board without adequate cooling.

>> No.68443923

>12w chips
125w chips I mean.

>> No.68443925

It's a 6300, not an 8300. But yes, this mobo died after 2 years of running at a 4.3GHz OC.

>> No.68443941
File: 265 KB, 800x522, Why the fuck everything single threaded.png

That's fucking impressive to run that long with no heatsinks.

>> No.68443961

Yeah, MSI aren't as bad as people meme about. It never really died; it just became unstable and juddery, and I had to underclock it a little.

>> No.68443976

>underclock it a little
Sounds like CPU degradation.

>> No.68443995
File: 22 KB, 453x434, death.png


>been running my FX 6300 at 4.4 for two years on my shitty asus 760g mobo
>at fucking 1.4v because i was a brainlet that thought the cores were throttling because of a lack of voltage
>its been three years since then

i-it's going to be all right, right?

>> No.68444018

just upgraded from an R7 260X to an RX 570 and now my FX 6300 is finally showing its age. it's been a comfy ride, FXbros

>> No.68444019
File: 2.50 MB, 4032x3024, RAM and Board.jpg

I'm surprised his CPU would degrade; FX can take 1.5v+ on the core and be fine with that. Is it possible for a VRM to degrade? I've been dailying a 1.38v OC for almost 2 years, and sometimes ran 1.44v when I'm feeling extreme, and it still holds the OC stable with me beating on the system with Prime95 and transcoding x264.

You'll be fine Anon, the FX chip is durable as fuck, and heaven forbid anything happen, gigabyte 970 boards are cheap.

Nice AMD GCN 1.0 card.

>> No.68444039
File: 14 KB, 294x180, gpu.png

it's actually an HD 7790, lucky me that i went with this instead of a 750 Ti, the AMD FineWine really works

>> No.68444063
File: 2.41 MB, 2576x1932, 20180905_205527.jpg

It really was a FineWine series of cards. I just got a 7970 and it's a beast: 3GB of VRAM before the 780 Ti.

I miss the single slot profile of my HD7750, but playing at 1920x1200 is cool.

>> No.68444209

This is my exact position. I am in possession of an unopened 8350 Black Edition and 16GB of 1866 RAM; do I make a build with it or not?

>> No.68444215

I did say super budget

>> No.68444231
File: 1.25 MB, 4000x2248, 2017-08-26-484.jpg

Hell yeah dude, get a gigabyte 970 board and enjoy it. Set the vcore to 1.38125v to get that 4.2GHz OC and roll with it.


>> No.68444252
File: 262 KB, 894x588, Speccy.jpg

This baby still does me fine; I don't play new games and I don't keep 100+ tabs open either.

>> No.68444366

>FX can take 1.5v+ on the core and be fine with that
It will degrade faster than normal at 1.5v
1.45v is what AMD rates it for with a 5-yr lifetime at stock frequencies.

>> No.68444372

Phenom II is a fucking mad CPU.
If you can run it over 4GHz, it's better than Bulldozer.

>> No.68444396
File: 15 KB, 355x355, 41pFlK2grvL._SY355_.jpg

isn't phenom better than fx? let's throw 3 phenom cores against 3 fx cores

phenom would outperform them,
but phenom almost always has fewer cores than fx

so fx would win in multicore software, but only marginally

1 phenom core is almost equal to 2 fx cores
1 ryzen core is 2 phenom cores

clock speeds must be the same

>> No.68444411
File: 115 KB, 653x726, 1491349230554.jpg

Shit, FX9590 owners must be fucked, because their CPU ships with a 1.5375v VCORE.

>tfw 4.2GHz Thuban performs the same as 4.4GHz Bulldozer
The FX uses little voltage to get to 4.4, but Thuban technically does beat it, even if it needs 1.5v to do it.

One Phenom core is 5% better than an FX core clock for clock; put two cores in an FX module under load and the IPC will be about 25% less than a Phenom core, but there are 8 of them at a high frequency.
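Taking that poster's rough numbers at face value, the back-of-the-envelope throughput math goes like this (purely illustrative; the 0.75 relative IPC and the clocks are the thread's figures, not measurements, and real IPC varies wildly by workload):

```python
# Crude per-chip throughput model: cores * relative IPC * clock (GHz).
# Phenom II core taken as 1.00 relative IPC; an FX core as ~0.75 when
# both cores of its module are loaded, per the post above.
phenom_x6 = 6 * 1.00 * 4.2   # Thuban X6 overclocked to 4.2 GHz
fx_8350   = 8 * 0.75 * 4.4   # Vishera 8-core at 4.4 GHz, modules loaded

print(f"Phenom II X6: {phenom_x6:.1f} relative units")  # 25.2
print(f"FX-8350:      {fx_8350:.1f} relative units")    # 26.4
```

By this crude model the 8-core FX edges out the X6 on aggregate throughput despite the weaker per-core showing, which is the poster's point: lots of slow-ish cores at high clocks.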

>> No.68444428

>32gb ram


>> No.68444432

>fx would win in a multicore softwares but only marginally
Phenom II has dedicated FPU per core, so a Phenom II x6 will beat an FX-8350 in floating point workloads.
FX really shines in integer workloads, and beats out Phenom II per core as well as multicore.

>> No.68444461

>FX module under load and the ipc will be about 25% less than a phenom core
Only for FP/INT mixed workloads. For server workloads that are heavily INT, Bulldozer completely shreds Phenom II

>> No.68444473

Why not? RAM was cheap back then.

>> No.68444483

>fortnight date intensifies

>> No.68444495

Part of it might be him lying, part of it might be you're still getting almost 1/3 better framerates than he is. A lot of it is that fortnite probably doesn't use multithreading very much, if at all

>> No.68444503
File: 2.07 MB, 4032x3024, IMG_0218.jpg

It really did beat out Phenom when it came to actual work like rendering. I still think an FX-8350 rig with 32GB of RAM to cache things and a low power video card is a fucking budget workstation.

>> No.68444509

FP heavy workload, so yes, Phenom II wins.

>> No.68444511

Finding out that most benchmarks are owned by jewtel.

>> No.68444552

XFX Single slot coolers are pure sex.

>> No.68444554

Thuban is a good chip, but it needs a lot of voltage to do what my chip does with its modules at 4400MHz and 1.38v.

If I put my CPU in one-core-per-CU mode I can get 4 real cores at 4.5GHz+, so it performs like a Phenom X6 1055T/FX6300.

>> No.68444572
File: 153 KB, 765x624, ea74408d70eb8a05ea8cb2f6c7dbc389.png

>Zen is pretty much identical to Bulldozer

Are you okay?
Do you have brain damage?

>> No.68444574
File: 969 KB, 1932x2576, 20181106_152732.jpg

I have been trying to search for another used XFX Ghost HD7750, but people want over $100 for it, so I settled on getting a brand-new-in-the-box HD7750 reference PCB card from HIS to have a spare low power video adapter that performs nicely.

Still on the lookout on eBay if someone's selling one cheap.

>> No.68444599

Phenom II is a based CPU but it's a shame that it doesn't support SSE 4.1 or 4.2
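If you want to check which SIMD extensions your own chip actually reports, on Linux the flags are right there in /proc/cpuinfo. A small sketch (Linux assumption; it returns an empty set elsewhere rather than failing):

```python
def cpu_flags():
    """Return the feature-flag set of the first CPU listed in /proc/cpuinfo."""
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass
    return set()  # not Linux, or no flags line found

flags = cpu_flags()
for ext in ("sse4_1", "sse4_2", "avx"):
    print(f"{ext}: {'yes' if ext in flags else 'no'}")
```

On a Phenom II the first two would come back "no", which is the shame the poster is lamenting; Bulldozer/Vishera report them (plus AVX).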

>> No.68444617

no, it's bad
t. former FX-6100 owner

>> No.68444635
File: 1.66 MB, 2576x1932, 20181110_204433.jpg

Is this not the sexiest single slot GPU you've ever seen in your life?

This. Vishera has the modern instructions and it gives a lot of performance at a low voltage compared to Thuban.

>> No.68444671

t. 2700X and a 1080. Your friend is a lying cunt, and minimum fps on bullshit-dozer is garbage.

>> No.68444730

My guy! Thank you! Do you have any advice as far as cooling then? I have the Gigabyte 990FXA-UD5-R5 rev1.0 and HyperX Fury for RAM, similar to >>68444019
but aryan edition

>> No.68444758
File: 1.59 MB, 4032x3024, IMG_0213.jpg

I use the Hyper T4 cooler. It's very large, and yes, it mounts sideways facing the GPU, but it uses the latch mechanism, which is easy to use, and I am lazy.

Couldn't be more happy: 58C with a 4.4GHz OC on 8 cores, at 1.38125v and Medium LLC.

>> No.68444760

Wait a damn minute!

Is that dust that is caked on at the top of your keyboard? Holy hell, now that I'm looking at it closer there is so much dust in that keyboard! This is literally just a little over a year ago... compressed air cans were still pretty affordable and accessible back then.

>> No.68444769
File: 2.58 MB, 2576x1932, 20181110_210146[1].jpg

Yes it's dust.

>> No.68444789

Wtf I watched an FX-8150/Radeon HD 7970 Fortnite benchmark yesterday and was amazed at how well it was performing and now today I see a thread about Bulldozer performing great on Fortnite. Coincidences are weird.


>> No.68444793

7970 and bulldozer is a match made in heaven, my card sags so fucking much and the tdp is high but it's amazing.

>> No.68444880

It sucks that the HD 7970 had terrible drivers when it first came out. NVIDIA saw how terrible it was performing and chose to release their mid range GK104 GPU as a high end GPU (GTX 680).

Now the HD 7970 absolutely destroys the GTX 680 since the GTX 680 was never a high end chip.

>> No.68444884
File: 1.62 MB, 2576x1932, 20181110_211517[1].jpg

Okay I cleaned it for the first time since 2014.

>> No.68444919
File: 26 KB, 540x399, Woah mama-mia.jpg

Do you just not use that computer or keyboard anymore? Why is there so much?

>> No.68444933

Kek, I just never use the 10-key side of it.

>> No.68445009

>I run my 8320 4.4GHz 1.37v on an NH-D9L.
I had the 8320E (just a lower clocked and lower watt version of the 8320) running at 4.6GHz @ 1.404v. It was a really good chip in that regard. With a Phanteks PH-TC12DX for cooling. Sold that chip/setup and got a 4790k when they came around and got that to 4.8GHz @ 1.275v using the same cooler.

>> No.68445017

Is the 8320e super nicely binned or something to get 95w at 3.2? I've seen guys getting insane clocks on it, better than what I have on the 8350.

>> No.68445033
File: 19 KB, 475x413, BOTNETS2.png

1. at lanch AMD FX was enought for all games
2. now AMD FX is enought for all games
1. at lanch intel 2600 was over enought for all games
2. now intel 2600 is shittier 2 core corelet freesing in all new games

>> No.68445080

No idea. Been years since I owned it / clocked it. 3.2 @ 95w wasn't outrageous when the regular 8320 was 3.5 (I think) @ 125w. Piledriver was mid 2012... so Ivy Bridge would have been the competitor. The 6 core / 12 thread 4930k was a 130w chip with a 3.4GHz base clock and 3.9GHz turbo.

>> No.68445399

FX at launch was pretty shit because 99% of games and applications (barring major production programs) were made for single core rather than multi-core. I owned a Bulldozer FX-4100 at 4.6GHz on air, then a Piledriver FX-6300 at 4.9GHz on store-bought water, and lastly an FX-9370 at 5.5GHz on custom water. When I pushed for 5.6GHz, I popped 2 voltage regulators and went Intel. FX were a blast to overclock, had the heat output of blast furnaces, and my God, their encoding and video editing capabilities were awesome.

With how common multi-core is in software these days, FX can still hold its own. My buddy still has his FX-8320 at bone stock with his Radeon 7950. They still give him around 60FPS in most titles at medium to high settings at 1080p.

>> No.68445408

I already have a R7 2700x.
And yet, whenever we do lan parties, his framerates are lightyears above mine, despite the fact I've done everything I can to optimize. (And again, my framerates are actually right on par with other Ryzens)

>> No.68445423
File: 133 KB, 287x389, TESTYOU.png

>FX at launch was pretty shit because 99% of games and applications (barring major production programs) were made for single core rather than multi-core.
so wat? 120FPS vs 205FPS is not advantage on 60FPS monitor
1. at lanch AMD FX was enought for all games
2. now AMD FX is enought for all games
1. at lanch intel 2600 was over enought for all games
2. now intel 2600 is shittier 2 core corelet freesing in all new games

>> No.68445445

Chill my dude, he wasn't hatin' on the based FX.

>> No.68445455

Lol, except FX wasn't getting even remotely close to 120fps unless you were playing on low settings. I would know; I had FX since launch. Even with big dick video cards, some titles would dip into the high 30s and barely break the mid 50s on an R9 290X or R9 Fury. (Metro, Wolfenstein, WoW before the engine overhaul)

FX had its uses and was a solid chip given its price, but don't make it out to be more than it was.

>> No.68445488

120FPS vs 205FPS is not advantage on 60FPS monitor
buyers of intel 2600/2500 retarded

>> No.68445528

Once again: FX wasn't getting 120fps, you ESL motherfucker. Look at >>68441146

The 8350 was struggling to reach 80FPS while the 2500k was pinning 120fps easy. You're correct that a 60Hz monitor won't show the difference; HOWEVER, having a higher FPS ceiling gives more wiggle room for FPS fluctuations. The 2500k had 60 more full frames worth of grunt, so you'll pretty much be locked at 60FPS at all times. Not so on the 8350 when heavy parts of the game happen. I'm also talking from personal experience with FX. And once again: FX was OK for its price, and it competes better today than it did at launch due to multi-core optimization. But stop trying to make FX sound like the Holy Grail.
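The "FPS ceiling" argument is easier to see in frame times. A 60 Hz monitor gives a ~16.7 ms budget per refresh; the further a CPU's average frame time sits below that, the bigger a spike it can absorb before dropping a frame. A quick sketch using the thread's 120 vs 80 fps figures (those numbers are the posters', not measured):

```python
BUDGET_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz

def headroom_ms(avg_fps):
    """Slack between the average frame time and the 60 Hz frame budget."""
    return BUDGET_MS - 1000 / avg_fps

for name, fps in [("2500K-class @ 120 fps", 120), ("8350-class @ 80 fps", 80)]:
    print(f"{name}: {1000 / fps:.1f} ms/frame, {headroom_ms(fps):.1f} ms headroom")
```

The 120 fps chip can eat an ~8.3 ms spike and still make the refresh; the 80 fps chip has only ~4.2 ms of slack, so heavy scenes dip below a locked 60 sooner even though both "look the same" in an average-FPS counter.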

>> No.68445537

>FX wasn't getting 120fps
so wat? 60fps is enought for 60 FPS monitor

>> No.68445546

also "high" settings dont have any differens from "mid" expect pointless loop for non intel CPUs
"high" settings is SCAM

>> No.68445559


>> No.68445566

He is lying unless you have some problem or different settings. Altavista fortnite benchmarks urself.

>> No.68445572

1. at lanch AMD FX was enought for all games
2. now AMD FX is enought for all games
1. at lanch intel 2600 was over enought for all games
2. now intel 2600 is shittier 2 core corelet freesing in all new games
buyers of intel 2600/2500 retarded

>> No.68445582

Is this an actual legitimate shill? For FX of an things? The broken English and spelling really sells it as such.

>> No.68445586

Tbh, there's a few other issues with my pc that keep me from going maximum overclock.
For instance, Fortnite shits itself if I overclock my RAM past 2400 or my GPU at all (the gpu is factory overclocked, so I actually have to slow it down to 1480 to get the damn game to stop crashing).
But, vanilla-for-vanilla, mine is par with other vanilla setups.
Now, the RAM has survived several passes in memtest at 3200MHz, so I'm honestly not sure why the Unreal Engine hates it, but I'm at least fairly sure my RAM works. The GPU might be fucked, though.
But this isn't a tech support thread, so I intend on posting that elsewhere when I actually have time to troubleshoot.

>> No.68445593
File: 1.88 MB, 2576x1932, 20181018_170652[1].jpg

I'm a huge shill of the gigabyte 990FXA-UD3 but I've never seen FX shilling this hard myself, and I like the 8350.

>> No.68445622

vishera != bulldozer

>> No.68445629

>That single slot card
Oh shit nigger is that a Radeon 7750/7770? As far as motherboards, my absolute favorite was the Asus 990FX sabertooth

>> No.68445638

(You) mad cose retard and reject it

>> No.68445651

It is indeed the based Radeon HD7750. That fucking Sabertooth R2.0 has an even better VRM than what's on the UD3; it's pretty much a watered-down ROG board.

>> No.68445684

I propose all people in that chart are retarded.
Smart people moved from Q6600/Q8400 to Ryzen 1700.

>> No.68445708

>Mfw killed my sabertooth because I forgot to re-enable LLC when pushing for a 5.6GHz overclock in my FX-9370
>Mfw the motherboard auto-boosted to 2.7v and kept it there for a full 25 minutes of prime95 stress testing before finally blowing and releasing all the magic smoke

>> No.68445711

Go back to India, Pajeet
Holy shit (in the street)

>> No.68445717
File: 1.61 MB, 1920x1080, 1541803045055.png

Had me so depressed I forgot my damn face

>> No.68445720
File: 771 KB, 4771x859, 1525127970971.jpg

The FX series was worse than the Phenom IIs at launch...

A stock non overclocked 2500k is better than any FX cpu in gaming still to this day.
A 2600k is still good enough for any GPU up to a v56/1070 ti

The FX chips have been a bottleneck since the 7950 came out in most games

>> No.68445724

This is some advanced shitposting.
Not even mad desu

>> No.68445732

>being stable for 25 min over 5GHz
Worth it.


>> No.68445752

Ask about his graphics settings. Maybe he spent a gajillion hours figuring out how to lower the graphics as much as possible.

>> No.68445754

I support intel cause I'm a jew, fuck you all and die for me goy.

>> No.68445762
File: 1.99 MB, 4000x3000, 08012016121.jpg

Nah, lowering graphics won't help you if the CPU is the bottleneck. I remember when I had my Q6600 on GTA V: it didn't matter if I had it on the lowest settings or high, it was getting 40 fps.

>> No.68445787

I used an FX-8320 for so long, running @ 4GHz on a Hyper 212 EVO; it handled everything I threw at it. I didn't upgrade to Ryzen until last month, when I managed to get a Zotac AMP! GTX 1070 for £200 unopened.

>> No.68445791

(You) mad cose retard and reject it.

>> No.68445800

>$260 usd for a 1070
That's still a sick card even with RTX out, Anon. Nice.

>> No.68445814

Tell me about it; I jumped up from a Zotac GTX 760 blower-style. Actually bought a 3GB 1060, but the card was fucked (strange electronic interference noise, very different to coil whine). After RMA'ing the card I got my money back, then saw this one listed for local pickup in my area; the dude already had a 1080 Ti and didn't need it.

>> No.68445823

Should mention this is when I first noticed the bottleneck. Lower-end Maxwell cards seem to run perfectly fine with FX processors; I was getting ~100 fps on Overwatch/Fortnite/CSGO/Rainbow Six/WoW (outside of raids) using a mix of medium + high settings. It was a workhorse for about 6 years.

>> No.68445859

Pentium 2 is pretty much identical to you hitting yourself in the dick with a hammer

>> No.68445869

Sell that and get a r3 1200
Yes, Phenom II x4 is better than bulldozer and vishera in single threaded applications.

>> No.68445881

kinda sad. i have an HD 6850, HD 6870, HD 7770, and a 7730, but not a 7750.
all would be for sale too, and i would gladly get rid of them for waaaaay under 100.

>> No.68445890
File: 500 KB, 1080x1920, 15 of them now.png

based fx poster
nope but this is

>> No.68445903

It technically is 5% better at the same clock, but a Vishera core will do 4GHz at the same 1.35v a Phenom II X4 965 does 3.4GHz at, and the FX-8350 has the other core in each of its modules to kick in when you're doing something multithreaded.

It's cool Anon, my HIS card in the box is cool to have and it's actually an AMD reference 7750 pcb

>> No.68446069


t. also ex FX-6100 owner

because it's missing the 3DNow! extensions

without those instructions, games fucked up and it affected FPS.

dropping many last-gen instructions was a stupid idea from the beginning.

>> No.68446082

Personally I have a water cooled overclocked AMD Ryzen 3 1300X.

>> No.68446209

Considering that the back end and the front end of Bulldozer are literally on Zen cores, then yes, they will see an indirect boost in games due to the fact that engines will be optimised for Zen.

>> No.68446703
File: 34 KB, 387x492, UD3 one core per cu.png

>Have to reboot
>why not bench one core per cu mode
>It's not bad

>> No.68448860

Coffee Lake is more identical to Pentium III than Zen is to Bulldozer lmao

>> No.68449028
File: 80 KB, 720x480, faggot.gif


you can't gaslight an old guy that doesn't give a fuck

>> No.68449076
File: 22 KB, 400x400, thats wrong.jpg


good luck with only 2 mb shared cache


>> No.68450768

This. Also, it's a 45nm process that can't clock as well, and there's no SSE4.2.

>> No.68451992

Brainlet here.
What is CPU cache and why is it important?

>> No.68452055

Jesus fuck did I just read something posted by a grade schooler?

>> No.68452254

Worth using if you got the cpu+motherboard+ram for free, otherwise it's garbage.

>> No.68452257

I think the Phenom II>FX meme needs to quit because a phenom X6 uses 1.5V to get the same performance my FX8350 gets with 1.38v, and everyone knows AMD cores at 1.5 are hot as hell.

>> No.68453202

It lets the CPU quickly access frequently used data and instructions without waiting on RAM, I think.

>> No.68453215

Basically this, RAM is like your CPU's tool belt and the cache is pretty much what you have in your hands.
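The tool-belt analogy above is about locality: data already "in hand" is orders of magnitude faster to reach than RAM. A toy illustration: summing a big 2D array along rows (contiguous in memory) vs down columns (strided, a new cache line almost every access). Same work, same answer; the row-wise walk is typically faster, though in Python the interpreter overhead mutes the effect a lot compared to C, so no specific timings are promised:

```python
import time

N = 1000
matrix = [[1] * N for _ in range(N)]  # 1,000 x 1,000 grid of ones

def sum_rows(m):
    # Cache-friendly order: walk each row left to right.
    return sum(x for row in m for x in row)

def sum_cols(m):
    # Strided order: walk down each column across all rows.
    n = len(m)
    return sum(m[i][j] for j in range(n) for i in range(n))

t0 = time.perf_counter(); r = sum_rows(matrix); t1 = time.perf_counter()
c = sum_cols(matrix);      t2 = time.perf_counter()
assert r == c == N * N  # identical answers either way
print(f"rows: {t1 - t0:.3f}s  cols: {t2 - t1:.3f}s")
```

This is also why the L2-size argument earlier in the thread matters at all: whether your working set fits in the "hands" or has to come off the "tool belt" dominates how fast the same loop runs.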

>> No.68453248

8 year old benchmark.

>> No.68453311

Funny how the precious 2500K is actually a bigger bottleneck than an 8350 nowadays, if the game takes advantage of the 8 threads in the four modules.

>> No.68454046
File: 16 KB, 288x288, I KNOW I'm cynical.jpg

>i5-2500K bigger bottleneck than 8350
You're a fucking brainlet.

>> No.68454157

In an ideal situation where the game is modern, it is better. Yes, the 2500K still has the single-core crown, but that's starting to matter less; otherwise Intel themselves wouldn't have brought 6 and 8 cores to the mainstream socket.

>> No.68454307

even in multithreaded games an 8350 can't keep up with a 2500k. look at any comparison video on youtube.

>> No.68455942

Phenom II wasn't bad for a 45nm CPU. FX was pretty fucking horrible though.

>> No.68456025

It's horrible at single core gaymen, but for multithreaded workstation use it's the most amazing price-to-perf I have ever seen in my life. That's why it's so hard to recommend to gamers: there is a very specific person who can really use an FX chip, which is really an Opteron server CPU clocked sky high, but that person usually had the budget for X79 at the time FX came out.

TL;DR: there aren't many poorfag hobby video editors around to make FX 8-cores sell well.

>> No.68456070

>it's the most amazing price to perf I have ever seen in my life
Only after the huge price drops, but even then it was hard to recommend for anything once Haswell/Refresh/Haswell-E released, because at that point more powerful CPUs with even better price/performance ratios were available on the secondary market, and new on retail shelves.

>> No.68456079


>> No.68456120

X58 and a hexacore Xeon is a really good used build even to this day, but if you wanted new parts in 2016, a $100 990FX UD3 board and 16 gigs of RAM for $89.99 was a really good option if you wanted a new computer that isn't an OEM workstation, can be overclocked a bit, and still didn't cost much.

>> No.68456188

Have you checked a single benchmark before making a claim like this? It gets destroyed in every single game; it's not even remotely close. FX was a useless piece of shit, still is, and always will be. No amount of muh optimization and whatever retarded bullshit you're parroting can change that.

>> No.68456494

1. at lanch AMD FX was enought for all games
2. now AMD FX is enought for all games
1. at lanch intel 2600 was over enought for all games
2. now intel 2600 is shittier 2 core corelet freesing in all new games
buyers of intel 2600/2500 retarded

>> No.68456515

1. learn proper english

2. i7 2600k runs circles around any FX cpu in games

>> No.68456618

You can get a UD3 rev2 for $150-180 now and 5660s are cheaper than when the 5650 was a popular re-buy.

Granted I'd still rather go for a Ryzen system today, but when I slapped together my 1366 machine it was legitimately the best bang for the buck you could hope to have for years.

>> No.68456683
File: 1.06 MB, 1913x925, price gouging.png [View same] [iqdb] [saucenao] [google] [report]

>One on the left isn't even new, has worn mounting holes, and is still claimed as new
I want to slap the shit out of people that price gouge the UD3 boards, but yes, LGA 1366 is an amazing HEDT platform to do a used build with, because 990FX/P67/Z68 are pretty much dead due to board pricing.

>> No.68456723

Sandy Bridge sadly bulldozed and continues to bulldoze Bulldozer. Stop kidding yourself.

>> No.68458041

>buying ancient as fuck hardware

>> No.68458956
File: 158 KB, 781x460, Faildozer.png [View same] [iqdb] [saucenao] [google] [report]

>the year is 2000+18
>AMD fans are still defending Bulldozer
What the hell is wrong with you people?
Is this some sort of Stockholm syndrome?
Ryzen exists now and is cheap and performs well. You don't have to defend your shitty bulldozer purchase anymore.
>FX aged better than Sandy Bridge
Even in heavy workloads, bulldozer still loses to the 2600k.
In gaymes the faildozer is upwards of 30% slower.

>> No.68459015

This. I'm an AMDrone but Bulldozer was shit. Phenom II was more impressive

>> No.68459158

There's a huge difference in CPU heavy games like Battlefield 1/V and GTA V.

>> No.68459417

Honestly AMD should have just improved the K10 architecture and made their next line of CPUs Phenom III. They would have been better off.

>> No.68459475

They could have, but they didn't put the time or money into it when they needed to. The company was falling apart internally in the mid 2000s. Right when their sales were strongest the management was the worst, and that caused a lot of their talent to leave for greener pastures.
The Bobcat core evolved into the Jaguar core, and that performs better than Phenom clock per clock. It started out as a small low power design, not targeting high performance, but it surpassed old high performance arch all the same.
A wider Jaguar targeting higher clocks could have been a beast, and still low power, if ported below 28nm bulk.
I still wonder what changes they made to it in the Xboner X.

>> No.68459589

>at lanch AMD FX was enought for all games
It wasn't though.
You couldn't play arma/ofp, starcraft, warcraft, flight sims, and a lot of other series that were available and popular during that time (excluding trine).
Even hardcore AMD shills like logan and wendell just used intel, with excuses for why they needed to, while shilling for AMD.

>> No.68459811

>paying someone else to ass fuck you with DDR4 prices

>> No.68459823

If Intel didn't pay OEMs such as Dell and HP hundreds of millions of dollars to not use AMD CPUs in any of their products back during the Pentium 4 era, AMD wouldn't have been in such a terrible situation going into the late 2000s. The Athlon 64 CPUs should have built an empire for AMD but instead terrible Pentium 4s were shoved down everyone's throats.

>> No.68459851

>right when sales were strongest management was the worst

Let that be a lesson because it applies to almost every situation human beings create for themselves.
It's like a poor person winning millions of dollars and going broke in 5 years. Or the USA electing Trump just in time for him to claim the peak of a natural recovery is his feat of prowess. It's like a small business buying out their franchisee contract only to overpay an under-qualified management team.

>> No.68459889

Don't bring politics into this.
Ur gonna derail the thread.

>> No.68459983

>The Athlon 64 CPUs should have built an empire for AMD
The world should have moved on to 64bit computing instead of staying on x86 with a 64 bit extension.
My opinion is that amd actually held software and computing as a whole back, because of that extension.

>> No.68459984

super budget only if your mom's paying the power bill.

>> No.68459993

but...HP is selling tons of workstations with ryzen cpus now!

>> No.68460014

Your only alternative would have been intel's Itanium and its impossible-to-use-in-practice compilers. The performance promised never would have been realized by the end user, and you'd be trapped in a single vendor hell.
X86 offered performance that no one else could touch in that era.

>> No.68460031

>Your only singular alternative would have been intel's Itanium
or power, mips, etc.

>> No.68460056

>X86 offered performance that no one else could touch in that era.
Not even true.

>> No.68460057

PowerPC got close if you were willing to ditch x86 compatibility.

>> No.68460066

I don't even like macs but I'm thinking about buying an air cooled Dual G5 2.5 or a Quad 2.5 with that stupid watercooler because PPC is awesome.

>> No.68460092

LOL, get a clue kid.
Go on Anandtech and look up benches comparing a G5 Mac to a C2D or AMD Opteron system. PPC was fucking dead in the water when multicore X86 came into existence. IBM didn't have a viable big POWER for fucking years.
MIPS? From what vendor? You actually think MIPS could compete with X86 in 2005? LOL
ARM development was nowhere to be found then, unless you wanted a microcontroller, or a desktop with the performance of 1995 in 2005.

There was no performance competitive alternative to X86 in the mid 2000s. None. Intel themselves thought they could make Itanium work. They were banking on it. They wanted to let X86 die so they could have sole monopolistic control of the high performance segment as no one was even remotely close to matching X86.

Again, that is a laughably fucking NO

>> No.68460097

RISC based cpus completely buttfucked x86 during that era. This is why they were used in high end workstations from intergraph and sgi.

>> No.68460110

Yet another underage kid talking out of his ass, totally blown the fuck out by actual benchmarks.
The dual socket G5 Power Mac could not compete with a C2D system. Apple knew this before the chips ever launched which is why they cut IBM off as a supplier while IBM was still trying to develop derivatives of the G5 with more cores, higher clocks, and on different nodes.
PPC 100% could not compete.

>> No.68460113

It takes time to access data stored in your computer. HDD/SSD are the slowest to access, RAM is a lot faster, and cache is even faster than RAM. Modern CPUs are so fast that a single RAM access can take as long as executing tens or hundreds of instructions, so the super-fast memory spaces close to the core are extremely important.
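The gap is big enough to see even from interpreted code. A rough sketch (the sizes and any timings are machine-dependent, not claims from this thread) that walks the same array once in order and once in a shuffled order, so the cache misses show up as wall time:

```python
import random
import time

N = 1 << 20              # ~1M entries, much bigger than an FX-era cache
data = list(range(N))

seq_order = list(range(N))
rand_order = seq_order[:]
random.shuffle(rand_order)  # same indices, cache-hostile order

def walk(order):
    """Sum data[] in the given index order, returning (seconds, total)."""
    start = time.perf_counter()
    total = 0
    for i in order:
        total += data[i]
    return time.perf_counter() - start, total

t_seq, s_seq = walk(seq_order)
t_rand, s_rand = walk(rand_order)
print(f"sequential: {t_seq:.3f}s  shuffled: {t_rand:.3f}s")
```

On most machines the shuffled walk comes out noticeably slower even though it performs the exact same additions; that difference is almost entirely the memory hierarchy.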

>> No.68460176

The last G5 Power Mac was released nearly a year before the first desktop C2D chip, and it competed well enough with intel's efforts at the time.

>> No.68460181

You're so fucking retarded. The reason the ppc cpus that apple decided to use couldn't keep up was the frequencies of the designs apple was choosing from. The line they were interested in was not able to hit over 3ghz, and power cpus were better than x86-64 at 64bit processing (every 64bit architecture was), but shit at 32bit software, because like all purely 64bit cpus they had to emulate 32bit since it wasn't native.
Also you're about 5 years ahead of the subject you are talking about.
So not only are you completely misinformed about the subject, but you are also retarded, because you aren't even in the time frame of the subject you are attempting to show knowledge in.

>> No.68460197

Then you compare prices and realize the FX is a better deal

>> No.68460214

>FX was shit at everything, but the price was low, so it was good even though it couldn't do anything that you wanted to.

>> No.68460223

PPC64's 32 bit support is native, no emulation. Arguably it's even more native than x86_64's x86 support: no need for a 64 bit kernel to run 32 bit programs or to use the 64 bit hardware address space. Apple never even bothered to do a 64 bit PPC kernel, waiting until Mountain Lion to do it, on x86_64 only.

>> No.68460229

Enterprise and huge OEM clients like Apple don't wait on consumer availability. Apple had test chips from intel, they saw the performance IBM was getting with new G5 prototypes, and Apple abandoned PPC because it wasn't competitive. Higher clocks, a new node shrink, none of that was enough to keep the G5 PPC arch or any derivative competitive.

>I'm an underage faggot hipster who hate on X86 while know literally nothing about this topic
>RISC so cool!
>lmao MIPS desktop in 2005 would have been so gud XD

Development of PPC didn't stall internally at IBM, you dumb little kid. IBM kept producing new chips for Apple. In 2005 Apple already had a viable replacement from intel, and it was so compelling that they switched all of their software over to X86, and that is not an investment a company makes lightheartedly.
Guess what year AMD64 launched?
Guess what year multi core X86 took off.
Oh, that's right. You're a dumb kid who wasn't even alive during this era. You just want to be a contrarian neckbeard and pretend like obscure trash is always better than the mainstream. Get over it, little kid.

There was no performance competitive alternative to X86. AMD creating the AMD64 extension saved the entire market from a dystopian hell where performance would have regressed for everyone across the board, or everyone would have kept using years old hardware as they have no platform to upgrade to.
You couldn't find an ARM system that would compete with an Opteron. There was no MIPS implementation in existence that had serial performance on par with X86.

There was no alternative, and you're talking out of your ass, kid.

>> No.68460239

No shit. We are talking about 64bit you retarded mong. Just drink bleach already.
Are you legit retarded?

>> No.68460252

I am talking about 64 bit. PPC64 is just as much of a hybrid as x86_64 is.

>> No.68460263

Look at the image I replied to, you pay 2x for the i5 but get 1.5x the frame rate. Do you really think that's bad?

>> No.68460266

Apple probably could've stuck with the G5 line for a couple years longer before IBM fell too far behind to have any hope of catching up to Core. The much more pressing need to switch was how ancient the G4s were getting, and IBM not being willing/able to bring the G5 down to anywhere near a sensible laptop TDP.

>> No.68460271

Shift away from apple and onto POWER3. Apple isn't a part of this conversation when talking about 64bit IBM architectures.
>Guess what year AMD64 launched?
AMD publicly disclosed it back in 1999, which is what fucked up pure 64bit cpus for the next decade, since it was cheaper for most companies to stay with their current 32bit software instead of upgrading to 64bit.

>> No.68460295

>Apple probably could've stuck with G5
They didn't want to, because IBM cpus had a premium over other manufacturers'. Hence apple would buy low end ibm and use the confusion to claim that their cpu choices were more powerful than everyone else's.

>> No.68460313

What if your goal was beyond the capabilities of the fx cpu? Would it still be a good choice for the price, or a waste of money?
Like would you buy a 5ft ladder that was on sale to look over a 25ft wall?

>> No.68460326


This is kind of a pointless discussion if we don't have serious information like the exact alleged FX processor model, cooling solution (many higher-end FX processors came with a 120mm liquid cooler), and the frequencies of the CPU and GPU. Ask him to take screenshots and post that info.
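On Linux all of that sits in /proc/cpuinfo; a throwaway sketch of pulling out the model and observed clocks (the FX model string below is just canned example data; on Windows he'd grab the same info from CPU-Z):

```python
def parse_cpuinfo(text):
    """Pull the model name, thread count, and highest observed clock
    out of /proc/cpuinfo-style text."""
    model, clocks = None, []
    for line in text.splitlines():
        key, _, val = line.partition(":")
        key, val = key.strip(), val.strip()
        if key == "model name" and model is None:
            model = val
        elif key == "cpu MHz":
            clocks.append(float(val))
    return {"model": model,
            "threads": len(clocks),
            "mhz": max(clocks) if clocks else None}

# Canned two-thread sample; on a real box you'd do
# parse_cpuinfo(open("/proc/cpuinfo").read()) instead.
sample = ("model name : AMD FX(tm)-8350 Eight-Core Processor\n"
          "cpu MHz    : 4013.282\n"
          "cpu MHz    : 4013.282\n")
print(parse_cpuinfo(sample))
```

The "cpu MHz" lines also expose whether turbo or an OC is actually active at the moment, which is exactly the thing he swears he isn't doing.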

>> No.68460329

PowerPC 970MP, aka the G5, was a POWER4 derivative; how is that not a 64 bit IBM architecture?

>> No.68460364

Agreed, I'm a big AMD fan but even I can admit that the FX line was absolute garbage

>> No.68460379

7 years later, AMD drones are STILL using the same bad arguments to defend Bulldozer.
That same argument actually works with ryzen as it's only 7% behind for 50% less money.

>Like would you buy a 5ft ladder that was on sale to look over a 25ft wall?
Top kek

>> No.68460401

How is it relevant to the discussion?

>> No.68460406

A dual socket G5 was drawing more power than a quad socket C2D, and offering worse performance than a dual socket C2D.
Even earlier, in June 2005, using systems that had been on the market longer, Anandtech had another comparison between the G5 Power Mac and two X86 systems.

The G5 was one of the best workstation processors available, and it was falling hard behind X86 from both AMD and intel by late 2004 already. IBM tried to push clocks as high as possible, they wanted to increase caches, make monolithic dual core dies, potentially quad core. They would have hit their original promised clock targets eventually, but it just would not have mattered.

>> No.68460439

Why are you even in the new millennium, 2005 even?

>> No.68460447

>but the price was low
Pretty much, have you seen the multi core score of an 8350 in cinebench? oh my god.

>> No.68460451

timeline for you.

>> No.68460475

>the multi core of an 8350
Which was as useful as trying to climb up 8 different short ladders to see over a single tall wall in a world full of tall walls.

>> No.68460483

your friend is a lying retard
and that's coming from someone who actually likes vishera

>> No.68460504

Why are you such an under age retard?
Look here, junior: >>68459823
then here: >>68459983

This chain started because someone brought up the Athlon64, a fucking CPU line first launched in Q4 2003. One shit eating retarded little hipster was trying to argue that AMD was detrimental to the market, holding it back, by keeping X86 relevant.

The facts of the matter are that nothing could out compete X86 in this era. The best workstation CPUs were from IBM, who at the time were the highest performance CPU manufacturer around, totally dominating the entire super computer market, and even they were failing to outpace X86.
There were no ARM or MIPS rivals to IBM's offerings in the early to mid 2000s. The only thing to match and then out compete IBM's PPC was X86.
Without X86 you would have had fucking itanium. There literally was nothing else.

Only contrarian little kids would ever try to argue otherwise. The computing market would have suffered immense stagnation without AMD64.

>> No.68460527
File: 374 KB, 1920x1200, quad-balls.jpg [View same] [iqdb] [saucenao] [google] [report]

damn right they're awesome

>> No.68460554
File: 3.65 MB, 2249x3119, FXCPU_Die.jpg [View same] [iqdb] [saucenao] [google] [report]

>The computing market would have suffered immense stagnation without AMD64
This, I think AMD did the same kind of thing with FX. Everybody was posting moar cores memes back then, but now all the mainstream chips have moar cores, because ipc is so stagnant it's the only way to get faster.

>Leopard on a big ass 1920x1200 screen
Pretty much my dream setup as a 12yo boy when leopard was hot.

>> No.68460589

>a fucking CPU line first launched in Q4 2003
The extension was released in 1999, and 64bit cpus existed then (as they were being used to develop everything worth buying, because x86 could not keep up).
All things being equal, RISC cpus are more powerful, but manufacturing is not equal, and ibm and the other risc manufacturers couldn't keep up on price during the mid 2000's, especially since the x86-64 extension had already been adopted by intel and amd. The era you are talking about is already past the decline of risc cpu manufacturing for low end units as well as pure 64bit cpus. Your arguments are based on the results of 64bit architectures losing the war to amd's 32/64bit extension, not on when they were fighting for adoption.

>> No.68460628

>The computing market would have suffered immense stagnation without AMD64.
You're dumb. What happened was:
>the world clung to 32bit instead of progressing into 64bit computing
Devs only recently started moving to 64bit software instead of everyone adopting 64bit in the year 2k. Stagnation literally happened because of AMD64.

>> No.68460634

I can run fortnite at 640x480, 50% downscaled, very low settings at 800 fps too

>> No.68460638

We're now in an era where a mainstream CPU family is topped by affordable 8c/16t CPUs. Where prosumer level hardware offers quad channel and 32c/64t systems, all while still maintaining high IPC, and moderate clocks to boot.
No strange compilers required.
No exotic single vendor lock in.
No proprietary memory standards.

I fucking shudder to think of what the world would be like if these contrarian chuds had their fantasies come true.
>muh 800mhz MIPS with $500 single channel proprietary RAMBUS dildo tech is the best thing ever
>I can put animu girls in my terminal window, and its even TRUE 64bit
>no one needs a graphical UI anyway

The AMD64 ISA extension didn't just pop into existence and cause developers of other ISAs to stop in their tracks. Everyone working on an ARM design didn't see a news headline and then stop coming to work.
IBM was top dog, and their next best competitor prior to X86's dominance was a far distant second place. The argument that there were alternatives to X86 is entirely fallacious no matter how you cut it. IBM's PPC was top of the market, and they were the tallest tree on the hill. There were no ARM or MIPS chips competing head to head with them.
Without X86, you would have had Itanium, and Itanium was awful. That would have been the alternative to IBM's stagnating PPC family.

>> No.68460661

the used prices on ebay show the 2500k as cheaper than the fx 8350

>> No.68460698

The 64bit extension was adopted because companies did not want to move to 64bit software entirely. That's why it won out. It didn't perform better than anything else at 64bit, but it didn't have an overhead when running 32bit software, so it became more attractive to everyone at the time. Was it the right choice for companies back then? Yes, sure... but the result also stagnated progress.
>Without X86, you would have had Itanium, and Itanium was awful.
Itanium was good at 64bit software, but itanium was not the only 64bit architecture in the game back then, and the higher end risc based cpus were in fact more capable at the time.

>> No.68460707

>the used prices on ebay show the 2500k as cheaper than the fx 8350
Early FX adopters btfoed again.

>> No.68460898

>People are being niggers on eBay so it's AMD's fault even though the price to performance of Vishera was bretty good

>> No.68461128

Itanium wasn't better at 64 bit software than x86_64 was though.

>> No.68461366

Based GameCube poster

>> No.68461416

>Itanium wasn't better at 64 bit software than x86_64 was though.
Yes it was. Itanium was shit at 32bit software though.

>> No.68461575

I played fortnite on my FX8320 a while ago and had similar fps, however it dropped under 60fps when there were 20-30 people in one city.
It's a good cpu but not for newer games; I never got it under 60fps except in bf1.

>> No.68461770

If your goal is beyond the capabilities of an fx cpu, you get something more powerful than an fx cpu; if your goal is to maximize performance and minimize costs, an fx becomes the best choice.
I don't understand your post, does that argument not apply to fx cpus as well?

>> No.68461941

>If your goal is beyond the capabilities of an fx cpu you get something more powerful than an fx cpu;
Good argument against fx cpus. No point in buying them since they aren't capable of doing anything other than shitting themselves.
>if your goal is to maximize performance and minimize costs, an fx becomes the best choice.
It never was the best choice. It's a cpu that is outperformed by other cheaper, readily available cpus, so there literally is no point in ever purchasing one.
What use case is there that makes an fx cpu "the proper choice" at the price ranges that you could purchase, and build an fx system?

>> No.68461968

I'm cozy with my FX-6300, I'm probably not gonna upgrade until like Zen2+/Zen3. Before that I may get some cheap second hand FX 83xx

>> No.68461985

>Before that I may get some cheap second hand FX 83xx
Just stick with your 6300. The "upgrade" isn't worth it. Might as well buy better ram, more storage, or a memory card front panel if you want an upgrade.

>> No.68462002

Got plenty of ram, plenty of storage. If I ebay'd the 6300 I could have an 8300/8320 for literally the shipping cost if lucky.

>> No.68462005

a new keyboard, monitor, mouse, or even using the money to watch a movie would be a better upgrade.
Fuck, buy a tasty burger instead of "upgrading" your cpu to an 8 core fx.

>> No.68462015

I have an fx 83xx that I don't use. How much would you pay for it?

>> No.68462018

You should upgrade to a Phenom II X6 1100T.

>> No.68462022

Though if you set it in bios to run 1 core per module, it's a really decent quad-core

>> No.68462026

FX apologists are the fucking worst.

>> No.68462037

>Though if you set it in bios to run 1 core per module
My motherboard won't do this. It can only shut cores off in pairs (whole modules at a time), so running 4 cores just means 2 full modules.
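If the board won't expose it, something close can be faked from software with cpu affinity. A Linux-only sketch, assuming the usual FX enumeration where the two cores of a module get adjacent ids (0/1, 2/3, and so on); check lscpu before trusting that assumption:

```python
import os

def pin_one_core_per_module(pid=0):
    """Restrict a process (0 = the caller) to even-numbered logical cpus,
    approximating '1 core per module' on the assumed FX layout where
    cores 0/1 share module 0, cores 2/3 share module 1, and so on."""
    total = os.cpu_count() or 1
    os.sched_setaffinity(pid, set(range(0, total, 2)))
    return sorted(os.sched_getaffinity(pid))

print(pin_one_core_per_module())
```

The two cores of a module share the fetch/decode front end and the FPU, so keeping a game's threads on one core per module sidesteps that contention, which is roughly what the bios toggle does anyway.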

>> No.68462060

Also, OS schedulers have been optimized for FX modules, so running 1 core per module wouldn't do anything beyond just turning cores off.

>> No.68462081

Not really. I bought a Phenom X6 for a lowend PC (sub 100 €) for my boyfriend. It is acceptable for the low end, but I doubt it competes with any current CPU.

My 2700X gets around 70 - 75 % better singlecore score, OC vs OC.
Phenom was the last really worthy AMD CPU before Ryzen; you could get the X6 for about 100 € new in 2011.

>> No.68462094

Got him exactly this card; I doubt you can get a better deal than a 7970 for 50 €. He can now run most games maxed, and current games at medium settings, in 1080p.

>> No.68462117

Was a good card, but I had to replace mine because it was getting too hot doing even the simplest shit like watching videos with mpc/madvr.

>> No.68462189

How? It runs at 30° here. The gaming performance is still great considering its age. An HD 7870 is currently usually the minimum.

>> No.68462198

The non-Bulldozer FX processors (Piledriver, etc.) are alright, but the Bulldozer ones were indeed dogshit.

>> No.68462223
File: 29 KB, 404x402, memepc.png [View same] [iqdb] [saucenao] [google] [report]

>> No.68462251
File: 470 KB, 421x702, fine-wine.png [View same] [iqdb] [saucenao] [google] [report]

>> No.68462256

I don't know. I guess mpc+madvr+svp gets pretty demanding.
> The gaming performance is still great considering its age.
The hd7970 had a good run, but it definitely has shown its age even at 1080p/60fps; at least mine has.

>> No.68462269

The HD7970 aged like fine wine; the fx series, not so much. It was just shit to begin with, and all the optimizations that came after really didn't do much for it.

>> No.68462272


nah it's terrible, you have to be on a new level of deluded fanboyism to think bulldozer was ever a good purchase.

even at new pricing the difference between a 2500k and an fx-8320 was like $50 at most, which is a few hours' work at a mcjob.

>> No.68462394

>my 900ghz FX was better than ryzen for 2fps!
ofc shitheads can just OC ryzen too; my r5 2600 running at 4.5ghz just rapes and nuts inside my old fx6300

The good thing about FX is that they're cheap, and ddr3 is cheap also, got an entire fx build (mobo+8350+16gb ddr3) for $100 used.

>> No.68462407

I have such a friend myself and I came to a conclusion he's lying.

>> No.68462754

I wanted to buy a second hand cpu more than a year ago. Coming from a phenom II and not liking intel's practices, I asked a friend if the fx series was good; he told me the fx 8350 was good and everything runs fine on it. He owned one himself, but only for less than 2 years, right around when it was released.
Now I'm stuck with my 8370. Don't listen to your friends.

>> No.68462780

Even worse, another friend who knows jack shit about pcs asked me 2 years ago to build him a rig. I was lazy as fuck and hadn't followed the cpu market for a good while, so I asked my other friend which cpu to buy; he told me an fx 8350...
Thank god my friend is still a tech retard.

>> No.68463067

The game is not well optimized to use your system's resources; try FFXV or MHW at the same settings as your friend to see the difference.

>> No.68463120

>Is Bulldozer better than it seems
>he gets insane framerates in video games.
He doesn't. At best he'll see those frame rates when there is nothing to stress his system. Like when he's alone in game standing still while looking at the ground, or paused, or in a game menu.
>Is he lying about his setup,
First time meeting an amd owner?

>> No.68463230

He probably lowers his game settings. But without having both systems to test, it is hard to say what is going on. A 2700X is far and away faster than Bulldozer. So something is either wrong with your setup or something is different about his. Or a combination of both. Maybe he is lying about what's inside his system or how it's set up.

>> No.68463347

Ah. Just read that Bulldozer can overclock higher on a single core. That may be why. Intel runs games slightly faster than Ryzen because of higher clocks. I never went the AMD route so I don't know much about their older processor line.

>> No.68464503

>this thread
ITT: Retards

>> No.68464529

you are in

>> No.68464555

fugg D:
