

/g/ - Technology


File: 893 KB, 1018x575, RIP.png
No.71364467

>> No.71364698

>Slow preset
Literally nobody uses that, It's called slow for a reason

>> No.71364727

that's why they all have either gigabit upload or shit quality streams

>> No.71364752

>Literally nobody uses that, It's called slow for a reason

Well now they do, stay mad Incuck.

>> No.71364944

AMD in a nutshell

>> No.71364998
File: 22 KB, 331x318, 2356727.jpg


>> No.71365073

Well done AMD, well done.
streaming is encoded through your RTX GPU now

>> No.71365329

gpu encoding has always been and will always be shit. just compare 5 Mbps NVENC to 5 Mbps x264, and you'll see
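The side-by-side the post suggests is easy to reproduce. A minimal sketch, assuming an ffmpeg build with `h264_nvenc` support; `input.mp4` is a hypothetical source clip:

```python
# Build matching 5 Mbps encodes for NVENC (GPU) and x264 (CPU),
# then compare the two outputs by eye at the same bitrate.
import subprocess  # used only if you uncomment the run() calls below

def encode_cmd(encoder, out, bitrate="5M"):
    """ffmpeg command targeting a fixed average bitrate."""
    return ["ffmpeg", "-y", "-i", "input.mp4",
            "-c:v", encoder, "-b:v", bitrate, "-maxrate", bitrate,
            "-bufsize", "10M", out]

nvenc_cmd = encode_cmd("h264_nvenc", "out_nvenc.mp4")  # GPU encoder
x264_cmd = encode_cmd("libx264", "out_x264.mp4")       # CPU encoder

# subprocess.run(nvenc_cmd, check=True)
# subprocess.run(x264_cmd, check=True)
```

Pinning `-maxrate` to the same value keeps the comparison about encoder efficiency rather than rate-control slack.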

>> No.71365382


>> No.71365419

Imagine being a 480p streamer using NVENC.

>> No.71365503

So let me get this straight...if you're a streamer, go ahead and pay $1200 for a new x570 motherboard, $200 for the fast ram you need to make Ryzen decent, and the $500 CPU. Everyone else can spend a fraction of the price on the 9900KS which has guaranteed 5.0ghz clocks. K.

>> No.71365505


>> No.71365523

What does it say behind mommy? What codec is this?

>> No.71365536
File: 36 KB, 707x299, Are+you+telling+the+truth+i+tried+googling+it+and+_d03b93ad060be2ad6a4ffe70bed0545f.jpg

>pay $1200 for a new x570 motherboard

>> No.71365541

Nobody streams on GPU except clinically retarded shitters.

>> No.71365584


People buy 7980XE just to stream and game because on GPU it looks like shit

>> No.71365594

lol what. Twitch has 6mb max limit.

>> No.71365611
File: 461 KB, 1920x1080, .jpg


>> No.71365621

intel cant make good bait

>> No.71365622

Intel shitters should be exterminated together with their retarded families for raising them like this lying subhuman.

Prove me wrong.

>> No.71365626

I bought my 9900k like 5 months ago at this point

>> No.71365687

>turn it to fast preset

>> No.71365699

Literally anti-semitic.

>> No.71365707


>> No.71365720

>pay 100$ for x470 motherboard because i dont need NVME raid

>> No.71365739

you know slow preset means more cpu usage right? if you use it on fast it's supposed to tax the cpu less you fucking brainlet

>> No.71365746
File: 185 KB, 1280x960, soyface.jpg


>> No.71365754

nvenc is fine as long as you don't play any games with fire or waterfalls

>> No.71365775

>nobody uses better quality encoding as it kills cpus
>/g/ then decides nobody should ever use it, even if the hardware becomes available.

>> No.71365861

for the sake of argument let's say the slow preset is 20 times more taxing than the fast preset (which it's not)
what's 20 times 1.6 fps?
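Worked out, the anon's arithmetic (the 20x factor is their own hypothetical, not a measured number):

```python
# If slow were 20x as taxing as fast, the demo's 1.6 fps at slow
# would scale to roughly this at fast -- still short of a 60 fps stream:
fast_equiv_fps = 20 * 1.6
```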

>> No.71365876

Source ?

>> No.71365883

or smoke or fast movement or literally a dozen other things that cause it to turn into a mess of block noise

>> No.71365908

ok? Point is, it's fine.

>> No.71365918

>16 cores

>> No.71365937


>> No.71365994

Source ? Mine says otherwise

>> No.71365995

Still no 5ghz lmao

>> No.71365997

the people that literally buy another computer just to stream
so every popular streamer ever

maybe for you, most people would prefer their shit to look better than a 3mb webm on 4chan

>> No.71366105

Still way better than anything Intel has to offer. cope harder.

>> No.71366160

your source is retarded
the default setting for OBS is x264 veryfast, which no one uses

>> No.71366256

Hello time traveler, I too remember your home year of 2013.

The improved NVENC in the RTX series looks effectively identical to medium preset x264

>> No.71366280 [DELETED] 

there is no medium preset for x264
try again when you know what you're talking about

>> No.71366363

the only comparison i can find compares an RTX NVENC with x264 veryfast

if you have evidence to the contrary post it

>> No.71366389

4.7GHz is close enough. With that massive IPC improvement it beats Intel's 5GHz.

>> No.71366435
File: 74 KB, 896x489, mpu_AMD_FX_9590_5_GHz_id1371034423_263351.jpg

>responding to brain dead 5ghz posters with anything but pic related

>> No.71366464

but it doesn't though. They trade blows depending on workloads. It beats it in cinebench clock for clock, but loses in some games for example.

>> No.71366561

>but loses in some games
the same can be said for both of them, but only one of them does it at 105W, has 12 cores for non-gaming performance, and doesn't require a chiller

>> No.71366675

>Get shit compression on fast preset

>> No.71366701

Stay mad eyelet

>> No.71367334


If 4.7 is the OOB guaranteed performance then 5GHz is a very realistic overclock.

But seriously niggers, how many of you have 380hz monitors?
CPU gaming performance is so high it's irrelevant to the equation.

I would love one for my productivity and creative stuff, but for games? If I were just a retarded gamer who just eats, sleeps, and games I would just get a 3600.

>> No.71367350

oy vey

>> No.71367374

Quality doesn't matter

>> No.71367478

1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS
1.6 FPS
>1.6 FPS

>> No.71367505

>pay $1200 for a new ... motherboard
AMD board prices are quite a bit lower than Intel's.

>> No.71367529

Not on X570 they aren't

>> No.71367618

But I've had my 1080ti, 8700k and ddr4 4400 cl19 for a long ass time now.
It looks like when I upgrade I'll be able to get a very nice gen 2 navi 64cu hbm monster with a 16c32t 7nm+ 5ghz ryzen 4900 though.

>> No.71367639

>sitting this close to the monitor
>already has glasses
YIKES. intel users are LITERALLY blind

>> No.71367645

>6 cores
Don't forget to disable HT :-)

>> No.71367848

Considering the average for X470 is $180, we'll probably see them closer to $250. That's not outrageous for nicer boards.

>> No.71367951

The eye can't see over 1.6 fps anyways.

>> No.71367990

wooooah a bar chart intel btfo!!!

>> No.71368880

>If 4.7 is the OOB guaranteed performance then 5GHz is a very realistic overclock.
wrong. The 2700X barely overclocks to 4.3GHz on all cores.

>> No.71368959


This anon is right >>71368880
Ryzen's boosting scheme already pushes the cores as far as they will go and even pushes voltage to 1.5V with XFR for microboosts. 1.5V is past ASUS's recommended max voltage for OCing, meaning your chip will degrade quickly at that voltage in a 24/7 OC, and P-state OCing never worked well. I imagine early golden samples and more mature bins of the 3800X might hit 4.6GHz within safe voltage, with a very small handful hitting 4.7, although at >1.4V it's going to be crazy hot.

3900X and 3950X, I doubt we'll see any high all core OCs. It's way too much heat density for even a water cooler to handle

>> No.71369003

someone post the list so we can add this to it

>> No.71369049

>people actually care about this shit

>> No.71369239


>> No.71369289

Lol I have a Ryzen and I don't use CPU encoding for streaming, I use the new Nvenc.
All my friends use Intel Quick Sync or Nvenc.
Pro streamers all use two systems.

>> No.71369535
File: 152 KB, 400x560, E76DE9EDC3CF4B26B88E66D4FE57AFC3.png

>Nobody uses a feature because it's too demanding
>AMD makes a CPU that can handle the task with ease
>>W-w-we didn't want that anyways!

>> No.71369724

>So let me get this straight
go back to twitter you retarded shill

>> No.71369772

>software and workloads used in performance tests may have been optimized for performance only on intel microprocessors

>> No.71370142
File: 61 KB, 224x314, transparent debiru.png

Guess who is using it in 3 months? I can barely get away with fast preset on the 1700, this looks absolutely insane.

>> No.71370181

Not that anon, but it's literally the first result when typing "obs new nvenc"
Disregard the faggot and look at the comparisons, it's fucking great. I'd never use it since Linux AMD for what appears to be life, but it's good to know such an option is not shit anymore. That being said, software encoding will always be better than hardware.

>> No.71370253

>they think streamers build their own PCs
>they think Fortnite zoomers build their own PCs

>> No.71370255
File: 54 KB, 679x758, 1560090428223.jpg


>> No.71370265
File: 68 KB, 269x195, 1552407706738.png

it's all they can cling to since everyone else has literally no reason to pick AyYYYMD over Intel

they are defeated and desperate, scary & pathetic

>> No.71370287

>be shroud
>have 2 2080ti
>still uses a dedicated streaming pc because cpu is just plain better

>> No.71370289

>not having a dedicated machine for encoding and streaming

>> No.71370344

wait, is that x264 slow 1080p at 59fps? with a game running?

>> No.71370345

>be shroud
>shit talk AMD
>know nothing of tech

>> No.71370396

>"Quality doesn't matter"
>-- Intel, 2019

>> No.71370419
File: 84 KB, 703x193, cap.png

sure thing bud

>> No.71370439

>Rs 5000
what's that, runescape gold?

>> No.71370441

MSI's CEO said they'll start at $220. And we already know high end boards go up to $500-600 with some limited edition models up to $999

>> No.71370442
File: 10 KB, 224x250, 1494162098217s.jpg

>Quality doesn't matter

>> No.71370468

Well, it's not slow anymore :^)

>> No.71370480

what's the weather like in tel aviv

>> No.71370573
File: 151 KB, 519x543, jip.png

>q-quality doesn't matter anon!!1

>> No.71372029


>> No.71372069


>> No.71372136

Who cares. Tell us how fast it compiles the gentoo stage 3 tarball, then we'll talk.

>> No.71372182


>> No.71372230

Quantum tachyon levels of fast

>> No.71372337
File: 631 KB, 500x493, 1526078534530.gif

>Believing the results in these kinds of presentations reflect real usage
Ultimate brainlet

>> No.71372375

not great, but not terrible

>> No.71372389

Literally nobody uses that, It's called Ultra for a reason

>> No.71372390
File: 353 KB, 660x923, 74732297_p7.jpg

That's still 5760FPH.
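The FPH figure checks out, for what it's worth:

```python
# 1.6 frames per second sustained over one hour:
frames_per_hour = round(1.6 * 60 * 60)  # 5760
```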

>> No.71373263

My motherboard is ready

>> No.71373653

60 fps? What do you need 30 fps for? 15 fps should be enough right? Here, your 1.6 fps bro lmoa

>> No.71373682
File: 81 KB, 626x657, intel shill.jpg


>> No.71373937


>> No.71374163

*Laughs in PCIE 4.0 RAID 0*

>> No.71374233

>Pro streamers all use two systems.

And now you can just use one chiplet for the stream and another for your game. Instead of two separate PCs with all the costs and headaches involved.

Stay mad shintel fag.

>> No.71374254

wtf are you talking about? I compile ffmpeg with slow as the DEFAULT because it has the best gains-to-speed ratio of ANY PRESET
fucking retard

>> No.71374315

What is up with all the AMD cocksucking on this site? Both are greedy as fuck companies out for your money, it's just AMD's turn to be behind in market share so ahead in actual quality for your money. Fucking corporate shills the lot of you.

>> No.71374397

It's the numale equivalent of rooting for sports teams.

>> No.71374424

Everyone streams on GPU.

I doubt more than 2-3% of twitch streamers use software encoding.

>> No.71374425
File: 245 KB, 800x612, 1489160516428.png

Yes fellow /g/ents brand loyalty is really stupid.

>> No.71374435

How is it there, in 2012?

>> No.71374447

>gpu encoding has always been and will always be shit
How is it there, in 2012?

>> No.71374456

>Sucking corporate cock is fine as long as it's MY PREFERRED corporate cock

>> No.71374474

You heard him, stop with this brand loyalty goyim!!!!!!!!!

>> No.71374479

It's true?
Twitch themselves say 4500-6000kbps is the max for 1080p60fps.

Though I know some higher end twitch partner streamers that use 7000-8000, but still 8mbps isn't exactly a massive difference from 6mbps.
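Those bitrates translate into modest absolute data rates. A quick sketch of the hourly video totals (ignoring audio and container overhead):

```python
def gb_per_hour(video_kbps):
    """Gigabytes of video data produced per hour at a given bitrate."""
    bits = video_kbps * 1000 * 3600  # kilobits/s -> bits per hour
    return bits / 8 / 1e9            # bits -> bytes -> GB

six = gb_per_hour(6000)    # Twitch's stated 1080p60 ceiling: 2.7 GB/h
eight = gb_per_hour(8000)  # the higher-end partner figure: 3.6 GB/h
```

Which is the anon's point: 8 Mbps vs 6 Mbps is under a gigabyte per hour of difference.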

>> No.71374489


>> No.71374494

more like the intel 9900 KYS lol

>> No.71374498

It's true that Fortnite streamers use pre-builds; however, since Zen's introduction roughly 70+ percent of modern day streamers prefer to use pre-builds with AMD on board.

>> No.71374505

>roughly 70+ percent of modern day streamers prefer to use pre-builds with AMD on board.
What's the source besides your ass?

>> No.71374518


>> No.71374531

I don't give a shit about brand loyalty, but AMD just gave us 12 and 16 core CPUs with vastly lower price tags and better performance compared to Intel.
Hell, they could have sold the 16 core model for $1300 and it would have still been a far better option than Intel, but instead it's going for $750.
That's fucking amazing and borderline charity in comparison to what the alternatives are. They do deserve a bit of cocksucking for pushing the envelope with the prices and core counts. They could have easily asked way more for this lineup.

>> No.71374534

... Again, post the source don't just say things, show the proof

>> No.71374573

Wow, they deserve cocksucking for not being total jews?

Fuck off, they still don't compete in gaming performance and still have shit latency issues.

Not to mention, Intel doesn't produce a consumer 16 core CPU, so comparing it to HEDT and Xeons is simply retarded; of course workstation CPUs are gonna cost a lot more, just look at Threadripper and Epyc.

Most people don't need 12 or 16 cores for home use and gaming. They'll see more advantage from higher single core performance.

At least AMD is competitive again, but they don't particularly deserve praise for FINALLY doing what they should've been doing all along.

Best Intel in ALL aspects, then get your cock sucked.

Just competing simply isn't that impressive.

>> No.71374627

I don't cocksuck companies for doing their job and showing up to work, anon. It's potentially a good product if it pans out like they're hoping it will but that still doesn't justify this /v/-tier console wars faggotry.

>> No.71374688
File: 282 KB, 752x548, 1490573199739.jpg

>> No.71374730

>less than 2 fps
>not terrible

>> No.71374739
File: 1.06 MB, 380x184, 1560221571.gif


>> No.71374804

The higher end mobos come with PBO most of the time to push the boost clocks even higher. Most Ryzen chips can undervolt a bit at least. My 2700X runs with 0.1V less than stock and extends the boost to 4.3GHz on all cores with PBO. Don't forget that all chips come with a bit of undervolting headroom to ensure stability at stock settings, so if you have at least a decently binned sample you can at least get an undervolt going.

If you can run the 3950X with a 0.1V undervolt and enable PBO you should at least be able to push it to 4.8GHz and maybe 4.9 or even 5. So it's pretty likely for at least golden samples to hit 5GHz this way, even if it's for just one core.

However, why the fuck does 5GHz even matter that much? It's only a ~6% increase over 4.7GHz anyway, ~4% over 4.8 and ~2% over 4.9. Outside of clock speed, the 3950X is already within the margin of error of the top end i9 in only a couple of use cases and beats it in every other metric.
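Those percentages are straightforward to verify:

```python
def pct_gain(new_ghz, old_ghz):
    """Percent clock increase going from old_ghz to new_ghz."""
    return (new_ghz / old_ghz - 1) * 100

g47 = pct_gain(5.0, 4.7)  # ~6.4%
g48 = pct_gain(5.0, 4.8)  # ~4.2%
g49 = pct_gain(5.0, 4.9)  # ~2.0%
```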

>> No.71374828

AMD dominating.

>> No.71374835

They're neck and neck with Intel when it comes to gaming. If that's not competing I don't know what is.
>Not to mention, Intel doesn't produce a consumer 16 core CPU, comparing it to HEDT and xeons is simply retard,
So what? The thing is that I'm looking to get a 16 core CPU and what matters to me is the price and performance of such a product.
I couldn't give less of a fuck about what marketing category such a processor falls in, whether it's HEDT or gaming or whatever.
I'm looking to buy one and what matters to me is the price and performance. I'm going to save about 1000€ by getting AMD when I next upgrade and I'm also going to get a better performance in both single and multi core. That's impressive in all aspects.
Also what most people need when it comes to gaming is irrelevant. By making these high core models mainstream they're forcing Intel to compete in this category too, which benefits all of us in the long run.

My point is that they could have asked for way more money for their higher end products.
This isn't just showing up for work and doing their job, this is doing their job with one hell of a discount.
Also occasional shitposting with console wars tribal mentality is just good fun.

>> No.71374846


>> No.71374883

>AMD also said that they ran all of their Intel system tests that they shared today without software and firmware mitigations in place for these security flaws

>> No.71374889

ofc they don't, because you didn't have the ability till now

>> No.71374904

most streamers, until Ryzen, had a separate pc just for streaming the content, wtf are you talking about

literally nobody streams via gpu

>> No.71374908

Nobody really buys Intel except rabid fanboys.

>> No.71374920


>> No.71374982

this antisemitism is really unnecessary

>> No.71375009

This is a retarded use case because anyone who is actually serious about streaming has a 2 computer setup with a capture card.

>> No.71375027

Most streamers?

Are you retarded?

Most streamers have 0-10 viewers and uses GPU encoding because they can't afford a dedicated streaming machine.

The only people who can are rich retards LARPing that they'll make it big streaming, and established twitch partners.

Hell there are plenty of big streamers that don't use dedicated streaming computers too

>> No.71375038

Reminder, those mitigations have little to no effect on gaming performance.

>> No.71375040

Only retarded people use 2 computers for streaming in this age.

>> No.71375050

yeah, maybe if you only started watching streamers now, because 2-3 years ago it was MANDATORY to have a second pc for streaming

and since ryzen came along things have gotten a lot easier
and no NOBODY USES GPU for streaming

>> No.71375092

add it to the list

>> No.71375193

AMD's winning but where's that "8 core 65W engineering sample" that was supposedly beating 9900k at the Zen 2 reveal? What I can see here is a 12c/24t only keeping up with an 8c intel.

>> No.71375200

>just buy another i9 goy

>> No.71375242

>not using a single-PC setup with a 2700x
>not using process lasso or equivalent to stick your game on one CCX and OBS on the other
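The Process Lasso trick the anon describes can be done from a script too. A Linux-only sketch, assuming logical CPUs 0-7 and 8-15 map to the 2700X's two CCXs (an assumption about enumeration order; check `lstopo` or `/proc/cpuinfo` on your own box):

```python
import os

# Assumed CCX split for a 2700X with SMT -- verify against your topology:
CCX0 = set(range(0, 8))    # pin the game's threads here
CCX1 = set(range(8, 16))   # pin OBS / the encoder here

def pin(pid, cpus):
    """Restrict a process to the given logical CPUs (Linux only)."""
    os.sched_setaffinity(pid, cpus)

# With hypothetical PIDs for the two processes:
# pin(game_pid, CCX0)
# pin(obs_pid, CCX1)
```

Keeping each workload on one CCX avoids cross-CCX cache traffic, which is the whole point of the trick.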

>> No.71375246

>1.6 FPS

>> No.71375255
File: 926 KB, 1080x3932, Screenshot_20190611-090541_Samsung Internet.jpg

>and no NOBODY USES GPU for streaming
Confirmed retard
This streamer has a bit over 5000 subscribers and uses 720p60fps GPU encoding.

>> No.71375280

>59 fps
unwatchable, you need a solid frame rate, try again in 4th gen

>> No.71375287 [DELETED] 
File: 2.14 MB, 400x548, 1533270254222.gif



>Ryzen 3000 Gaming Comparisons With Intel: Performed Without Security Mitigations for Intel And Without Windows 10 May 2019 Update Ryzen Scheduler Enhancements



>> No.71375313

What a retard lmoa

>> No.71375327

he mad

>> No.71375335
File: 168 KB, 1021x503, 1547051816101.jpg


>> No.71375340

The whole point of that demo was that the game pegged all of the 9900K's available threads but the 3900X apparently had enough threads to spare to do a 60fps stream encode at the slow preset. This is likely going to be a theme for all the reviews. The CPU will just barely match the 9900K and will frequently be beaten if the game only runs on like 2 threads. The more multithreaded the game, the better the performance on the 3900x.

>> No.71375345
File: 573 KB, 1280x720, 5ghzdipper.png

AMD had 5Ghz before Intel, cuck.

>> No.71375361
File: 63 KB, 983x768, 983px-AmdahlsLaw.svg.png

>Most people don't need 12 or 16 cores for home use and gaming. They'll see more advantage from higher single core performance.
this. having that many cores is overkill for normal users. that's why amd is showing contrived benchmarks like cinebench, blender and streaming.
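The attached Amdahl's law plot is the standard version of this argument; the curve comes from a one-liner. A sketch with an assumed parallel fraction:

```python
def amdahl_speedup(p, n):
    """Max speedup on n cores for a workload with parallel fraction p."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a generously parallel (p = 0.8) desktop workload tops out fast:
s16 = amdahl_speedup(0.8, 16)       # ~4x on 16 cores
s_inf = amdahl_speedup(0.8, 10**9)  # ~5x no matter how many cores
```

The 20% serial fraction caps the speedup at 5x, which is why extra cores stop mattering for lightly threaded use.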

>> No.71375366

Yeah, video quality to bitrate ratio doesn't matter goy

>> No.71375373


>> No.71375379

every year games get more multithreaded, retard

>> No.71375385

how can you be this stupid to post something like this. thank God this site is anonymous.

>> No.71375396

And out earning you every single month while doing it.

He only streams 5 days a week too. Not some cuck streamer who streams literally 7 days a week to keep subscribers from leaving.

>> No.71375410

It doesn't when the streaming platform itself is reencoding with ASICs (just like your GPU encoder) and limiting bit rate to sub 10mbps no matter what

>> No.71375412

No amount of money will fix your smooth brain kek

>> No.71375423

It's true in every way; you're the retard if you honestly think every big streamer has a dedicated streaming rig.

>> No.71375450

When you're making $12,000+ a month before bits and cash donations, I really don't think I'd care what some poorfag thinks is "smooth brain"

>> No.71375461

you don't need top of the line equipment to get started with streaming. try getting some subscribers and revenue before you think about early adopting such an overkill cpu.

>> No.71375479

Still dumb, ugly, fat and alone kek

>> No.71375537

A lot of people are limited by their upload bandwidth; now with a cpu like this more people can afford decent stream quality even if they live in some countryside shithole and can't get good network speeds

>> No.71375612

>Tying the fastest gaming CPU in the world in gaming performance is a failure to compete in gaming.
Oh, I guess everyone will have to use the gaming CPU that's significantly faster than a 9900k then...oh wait.

Fuck off Intel shill.

>> No.71375652

It is the best quality one. Nobody uses it because no mainstream CPU could do it.

>> No.71375696

He's engaged and bought a house

>> No.71375712

i make 12k per deployed day at comex maybe you should change profession retard

>> No.71375718

That's actually pretty based to fight your opponent at their best while you're still gimped. Now compare that to Jewtel who have pages of fine print to not get sued for their false advertising.

Although AMD should have mentioned this from the very beginning. Without further information, these benchmarks look kinda unimpressive for the normal, uninformed consumer.

>> No.71375727

>amd talks about true 1080p streaming
>retard nvidiot keeps spamming about 720p pajeet resolution

why it doesn't surprise me in the least that people go so far with their buyer's remorse

>> No.71375750

>he is engaged
finding a thot isn't an achievement
>bought a house
paying your mother rent for the basement you live in doesn't make it a house

>> No.71375759
File: 288 KB, 552x661, intel shill 2.png

We just want good products at the best products and, at the moment, AMD is coming out with those. When Intel comes out with them I will be more than happy to get excited about that. But currently that's not the case. So this isn't brand loyalty at all - it's the complete opposite. It's excitement for good products, no matter where they come from.

Basically, this: >>71374531

1. We saw yesterday that AMD has equivalent gaming performance, beating it in many games, and that it can do tons of other shit in the background at the same time, while Intel grinds to a halt.
2. The most comparable Intel chip I could find is the i9-9960X, which costs $1,600, compared to $750 (less than half the price) for the Ryzen 3950X. Oh yeah, and the Ryzen has nearly three times as much L3 cache (64MB vs. 22MB).
This is the biggest load of cope I've ever seen. You're pathetic.

Source for cache sizes:

>> No.71375772

>>amd talks about true 1080p streaming
>>retard nvidiot keeps spamming about 720p pajeet resolution
It doesn't even matter at any resolution and any bandwidth limit higher quality encoding is going to look better which makes his comments even more

>> No.71375799

>Team Red
>Team Blue

Well done, idiots. We're the new consolefags. PC master race no more.

>> No.71375828

>gpu encoding is better than cpu one


>> No.71375837
File: 85 KB, 2000x2000, chrome-big.png

Mister Anderson, great to see you again.
It's just you, and
And me too!

>> No.71375842

>We saw yesterday that AMD has equivalent gaming performance
Yea, in AMD's own slides, which are always cherry-picked.

Imagine real world results

>> No.71375854

I never said it's better. But you're lying if you don't think it's the most commonly used.

>> No.71375874

>Oh and AMD also said that they ran all of their Intel system tests that they shared today without software and firmware mitigations in place for these security flaws, which gives Intel a much better shot at 'Best-Case Scenario' performance.

>> No.71375876

Is AMD becoming increasingly successful because they embraced cooperation with China?

>> No.71375881

As has been pointed out a half dozen times already, there is no significant hit to gaming performance with the patches enabled anyway, so it's kinda retarded to keep bringing it up as if it actually mattered.

>> No.71376044
File: 24 KB, 267x297, intel shill 3.jpg

>We just want good products at the best products
Whoops I meant at the best price, of course.

They showed results for a ton of different games, including ones where Intel was slightly ahead, as well as ones where AMD was ahead. Overall it was almost exactly the same.

Cope more you fucking shill.

>> No.71376114

I think it's just because they invested in a new architecture, employing that guy Jim Keller to do it, who is obviously a smart dude.

Intel, meanwhile, got complacent. Kept failing at the die shrink and kept releasing 14nm++.

But obviously Intel has now hired Jim Keller so I'm sure they'll come out with something cool in a few years' time. But for the next couple years we could see AMD stealing a fair bit of market share.

>> No.71376125

>They showed results for a ton of different games, including ones where Intel was slightly ahead, as well as ones where AMD was ahead. Overall it was almost exactly the same.
Again, if you trust AMD, you're a fuck head and deserve to be cucked


AMD is WELL known for lying out their asses in presentations, using settings that specifically favor their cards or CPUs and specifically put intel/Nvidia at a disadvantage.

>> No.71376127
File: 230 KB, 1017x785, 1552962637496.png

Nigga what
I can stream with a 2700x no problem. Could even get away with 400 american kgs of ass on Linux with a V56.

>> No.71376134

Jim Keller was poached by Raja Koduri to work on GPUs no?

>> No.71376154

with software encoding?

Most people do single computer streaming with a GPU, which has fixed encoding settings.

Software encoding allows FAR more flexibility in the encode quality and settings.

>> No.71376173


He was put in charge of basically intel's entire product stack.


>Senior Vice President of the Silicon Engineering Group

basically any and all silicon engineering, from ultra mobile to massive scalable Xeons.

>> No.71376184

>with software encoding?
Yes, but it wasn't on a good preset. The point is that it could still deliver a watchable experience and it can only get better from there by having more cores.

>> No.71376201

because the same never happens from the other sides

don't trust fucking anybody

>> No.71376202

Sure, but so you're not fucking with context switching so much, most people will just go with a 2 computer setup.

The advantage of a single box is really only found in its size.

if you simply CAN'T fit a 2nd PC for dedicated encoding/streaming, then sure I guess it makes sense.

But otherwise you're still better off with a 2nd PC.

>> No.71376237

I thought he was just on mobile (laptop) shit. If he's on everything, then we can expect something really good in 2022-2023. AMD has hit gold and the way they're moving up doesn't worry me one bit, but having that Intel competition is going to be great. AMD can eventually stack dies vertically, they can add more threads, and they will probably eliminate most of the discrete GPU market with APUs and stacked DRAM. By then, I would not be surprised if we had a 32-64 core APU with a GPU murdering the current 2080Ti.

>> No.71376264

Yeah, it will take several years for his work to really bring any significant changes, but I wouldn't be shocked by a massive architectural shift in the next 2-5 years, depending on the product category.

>> No.71376290


>> No.71376309

this is too ridiculous to be real

>> No.71376322

based shitwrecker.
seeing how successful zen/zen2 is, i'd upgrade to whatever he worked on when it releases in 2022/2023

>> No.71376327

I'm still having PTSD flashbacks to the hype they made around Bulldozer and then what they actually delivered

>> No.71376349

It's not; they simply put an 8 core against a 12 core CPU, and those extra 4c/8t are enough for encoding while intel chokes with all cores focused on the game.

>> No.71376350

Of course, intel are well known liars too, but intel isn't the one being shilled right now, it's AMD.

All i'm saying is wait for real world testing before jumping on the bandwagon.

Yup, any number of releases from AMD over the last decade to be honest.

>> No.71376373
File: 33 KB, 754x343, 2019-06-11 10_33_14.png

Yup, and they chose a game which basically doesn't scale much past 6-8 cores.

>> No.71376385

wtf? are you poor or what? lmao Intel is the poorfag option now ahahahahhahahaahhaha
>implying horseshoe theory cannot be applied everywhere

>> No.71376401

>up to +25% higher temps
ftfy Intel.

>> No.71376418
File: 175 KB, 349x345, 8bd.png


>> No.71376420
File: 46 KB, 150x262, 1523858854593.png

>"we didn't turn on security mitigations for intel"
>"these results may have been under optimal conditions only for Intel hardware"

>> No.71376450
File: 909 KB, 325x498, 669d7dca5e06858257d49e3a169fa990.gif

can't wait for 7/7 benchmarks

>> No.71376460

>bringing up security mitigations that have little to no impact on gaming


>> No.71376481

>missing the point

>> No.71376487

There's other things to consider than just the mitigations though. There's cooling (inadequate cooling will prevent proper boosting), RAM speeds, timings. There's no info on that.

>> No.71376504

No i get the point, AMD want to make themselves look like they're doing better than they actually are by implying they're up against intel at their best.

You're a fool for trusting AMD performance metrics, regardless of what is going on with intel.

>> No.71376549

I also suspect AMD is gonna pull an intel when it comes to TDP now. There's no way a 16c/32t has the same TDP as a 12c/24t, and at an even higher clock at that. Binning or whatnot, there are no miracles, only marketing tomfuckery. Basically AMD's TDP rating also becomes meaningless now.

>> No.71376560

Forgot the 8c 3800X, also "105" W "TDP"

>> No.71376571
File: 34 KB, 638x288, 2019-06-11 10_49_00.png

Yup, 8c at 3.9Ghz base is 105w
16c at 3.5Ghz base is also 105w

just doesn't make sense unless they've done SOMETHING with how they measure TDP.
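One way it can make sense: dynamic power scales with the square of voltage, and lower-clocked cores can run at a much lower voltage. A toy sketch (all constants and voltages below are made-up illustrative values, not AMD's real figures):

```python
# Toy model of why 8c @ 3.9 GHz and 16c @ 3.5 GHz can share a ~105 W rating:
# dynamic power scales roughly as cores * C * f * V^2, and lower clocks
# allow a lower voltage. Numbers are illustrative assumptions only.

def dynamic_power(cores, freq_ghz, voltage, c_per_core=3.0):
    """Per-core switching power ~ C * f * V^2, summed over cores (watts)."""
    return cores * c_per_core * freq_ghz * voltage ** 2

p8 = dynamic_power(8, 3.9, 1.05)    # hypothetical voltage at 3.9 GHz
p16 = dynamic_power(16, 3.5, 0.79)  # hypothetical (lower) voltage at 3.5 GHz

print(round(p8, 1), round(p16, 1))  # prints 103.2 104.8, both near ~105 W
```

The voltage drop from 1.05 V to 0.79 V cuts per-core power by about 43%, which is roughly what doubling the core count at a slightly lower clock needs.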

>> No.71376799
File: 81 KB, 1000x667, laugh.jpg

>being this butthurt because the AMD performed equally to the Intel in a bunch of games, and better in some of them

>> No.71376809


>> No.71376823

i don't get it, how can one guy be so based, what does he know that nobody else knows

>> No.71376835

>what is binning
If they went 4 core CCD to make the 8 cores, then we'd be looking at bottom of the barrel chiplets. The full 8 core chiplets should be much better binned, just look at Rome.

>> No.71376862
File: 76 KB, 500x500, 1553188191655.png

Wake me up when any "ryzen" beats my 11yo 2500k in any bench


>> No.71376870

this, they use very contrived benchmarks with cinebench, blender, and now streaming this specific game with software encoding on the slow setting, implying streamers can't just use a second pc for optimal stability

>> No.71376900

Is pretending to be retarded still fun? Needs to be very subtle for it to be amusing desu.

>> No.71376902
File: 1.97 MB, 400x200, nic cage laughing.webm

>disabling hyper-threading, causing a 40% loss in CPU performance, will have "little to no impact on gaming"

>> No.71376912

Shut the fuck up. Flamewars have been a thing since the BBS days.

>> No.71376970

Yes I would definitely expect that he will be able to create a great design for Intel and Intel will come back.

That's fine though, competition is good. Better processors for us, hopefully at cheaper prices.

I hope AMD can continue to offer great products too, though, because otherwise Intel will charge extortionate prices due to lack of competition.

Dunno, just a smart dude I guess. Like the John Carmack of CPUs.

>> No.71376975

>What is up with all the AMD cocksucking on this site?
AMD hires pajeets and Chinks to do online shilling.

It's like Applefags whenever there is an Apple event.

>> No.71377005

>what is binning
Here we go again, ayymdrones thinking binning is some kind of magic.

>> No.71377013

People don't realize that AMD fanboys have been living under a rock for ages and now that Ryzen is good they're coming out.

They've been waiting for this moment since 2006. These fanboys are the same idiots who stuck with AMD despite the E8400, Q6600, i7 920 and 2500k existing.

>> No.71377036
File: 403 KB, 1651x1651, doublediffused.jpg

I was curious about that, and yes, they do say diffused in the US and Taiwan, so the IO chiplet is still GloFo.

>> No.71377042

Phenom II, especially hexacore Thubans, wasn't a disappointment, even if it came a little too late.

>> No.71377080
File: 32 KB, 506x337, laugh5.jpg

No mate. I have no allegiances to any company because I'm not a corporate cuckold like you. Intel has been the better choice in some years, and now AMD is a better choice for many people, because the gaming performance is roughly equivalent, while having vastly superior multithreaded performance, and also being cheaper. Oh yeah, and not having security flaws that require you to disable hyper-threading, reducing the performance of your CPU by 40%. Oh yeah, and AMD has APUs in the low/mid segments which blow Intel's iGPUs to smithereens.

Stay mad though.

>> No.71377082
File: 1.25 MB, 1002x1480, 1526445578934.png

>having the cash to afford shilling

>> No.71377084

disabling hyperthreading is only relevant if you're in a virtual environment and don't want one VM to be able to read the memory of other VMs running on the same machine.

It has no real impact for gamers at home.

Which is why intel DOESN'T recommend disabling hyperthreading UNLESS you're in an environment that would actually be vulnerable to this attack type. Like a VM host provider.

>> No.71377095

Imagine what would've been if they did Phenom right at the first try instead of releasing a low clocked buggy mess.

>> No.71377105

>still lying blatantly even though you've been told dozens of times you're wrong.

>not a shill


>> No.71377130
File: 268 KB, 499x499, 1414228679001.png

>implying your 2500k could beat my 20 year old Q9550

>> No.71377195

he went to AMD to work on K12 and the mobile parts, anon; he had nothing to do with Zen

>> No.71377208

>In August 2012, Jim Keller returned to AMD, where his primary task was to design a new generation microarchitecture[5][11][15] called Zen.[14] After years of being unable to compete with Intel in the high-end CPU market, the new generation of Zen processors is hoped to restore AMD's position in the high-end x86-64 processor market.[3][13] On September 18, 2015, Keller departed from AMD to pursue other opportunities, ending his three-year employment at AMD.[20]

If you have other sources, feel free to change wiki


>> No.71377292

Anyone have the table with all the zen 2 CPUs?
Can't seem to find it

>> No.71377447

poo in the loo

>> No.71377485

you have to go back

>> No.71377488


>> No.71377505

When you factor price into the benchmarks, they already look pretty good.
AMD is BTFOing Intel without even using dirty tricks, even though they could; this isn't even remotely dirty.

Intel dug its own grave.

>> No.71377644

>without even using dirty tricks
ehhh, they're cherry picking the software AND making the settings something that would already favor AMD.

>> No.71377659
File: 364 KB, 1773x996, y8nxtm08um331.png

Why has nobody talked about this?

>> No.71377665
File: 173 KB, 1984x1044, ryzen_3000_process_voltage_-100798921-orig.jpg

unironically using charts that look like they're straight out of a /g/ meme

you jackasses overhype every single amd release. if amd turns out to be ACTUALLY competitive outside of cherrypicked benchmarks, and has better price/performance when including the cost of the motherboard etc, i'm guessing intel will adjust their prices

>> No.71377682

if you're running an overclock you keep the clock the same anyway

>> No.71377702
File: 209 KB, 600x582, 025_WPN78XQ.jpg

>if amd turns out to be ACTUALLY competitive
we both know this won't happen
at every single price-point Aymd gets pwned by Intel

But i guess let the obese autists have their 5 minutes of fun

>> No.71377709


>> No.71377711

Nobody buys Intel except rabid fanboys.

>> No.71377726

>Nobody buys AYMD except obese neckbeards.
fixed that 4 u

>> No.71377729

The fucking scheduler is fixed.

>> No.71377731

Exhibit #1

>> No.71377750

>1.6 fps

inToddlers eternally BTFO PFTTHAHAHAHHAH

>> No.71377801
File: 833 KB, 350x197, HatersGonnaHate - Kuroko.gif

>i'm guessing intel will adjust their prices
Thanks for outing yourself not knowing shit.

Stop using my lesbian wife to shitpost.

>> No.71377822

Let's go over the facts, shall we:
>some vulnerabilities found, but ALL of them have since been patched
>Intel ME has been locked down, audited by security experts, and is no longer a security risk
>patches have NEGLIGIBLE impact on gaming
>storage IO takes a slight hit
>much higher clocks (better for older games)
>much lower core and cache latencies (very important for gaming)
>more industry-adopted instruction sets, better optimization in Unreal Engine, Unity, Frostbite, etc.
>uses less power-per-clock and has better frequency-to-voltage characteristics (why Intel will always hit 5.0GHz and AMD will never be able to)

>Ryzenfall and 12 other vulnerabilities STILL NOT COMPLETELY FIXED
>PSP has been fully compromised and gives attackers complete access to all parts of an AMD CPU; AMD REFUSES TO DISABLE PSP ON ANY RYZEN CPU
>has performance loss with its already lower IPC (turning a shit hatchback into a shittier hatchback)
>literally slower NVMe throughput by default because AMD can't into IO queuing optimizations
>lower clocks than Intel's First Gen Core architecture
>almost twice or three times the latency depending on which cores are accessing which cache; made even worse with multi-die Ryzens (like all of the Ryzen 3000s)
>literally no one optimizes for AMD's instruction sets; locked out of the instruction sets the games industry uses
>uses far more power per clock because Infinity Fabric eats power the higher you clock it (guess what was exponentially increased on Ryzen 3000...)

Other notes - PICe Gen 4 literally doesn't matter
>storage IO loss literally doesn't matter for gaming because 99.99% of games don't come close to using a tenth of an NVMe drive's IO throughput
>4K 144hz gaming doesn't even saturate PCIe 3.0 x8

>> No.71377837
File: 39 KB, 700x692, brainlet9.jpg

Where exactly have the claims of that post been shown to be wrong? Name a single claim from that post that you think is wrong.

You're a fucking idiot, aren't you?

>> No.71377885

>require you to disable hyper-threading, reducing the performance of your CPU by 40%
Blatant lie right here.

No one is REQUIRING you to do it, unless you're a VM host.

A home user playing games would have no reason to do this.

>> No.71377888

Oh, you're really going for it. Hope you're getting paid, anon.

>> No.71377890
File: 29 KB, 426x146, intel cope.png

>some vulnerabilities found, but ALL of them have since been patched
Stopped reading right there. There is NO patch for MDS, it LITERALLY requires you to disable hyper-threading as the ONLY way to mitigate it, resulting in a 40% drop in performance.

Cope harder you fucking shill. Do you really think we're going to fall for your marketing? How much do you get paid for this shit?

>> No.71377913
File: 77 KB, 645x729, brainlet11.jpg

>just live with a gaping hardware-level vulnerability in your processor that CANNOT be patched and can only be mitigated by disabling hyper-threading, reducing performance by 40%, bro!
How much do you get paid for this shilling?

>> No.71377917

>There is NO patch for MDS,
Then what do you call the May 2019 updates?

>> No.71377982

Not a full mitigation, because it isn't. You must disable HT for full mitigation, otherwise you are still vulnerable. Here's Apple's analysis for their customers - all Macs use Intel processors:
>Full mitigation requires using the Terminal app to enable an additional CPU instruction and disable hyper-threading processing technology. This capability... may reduce performance by up to 40 percent, with the most impact on intensive computing tasks that are highly multithreaded.

>> No.71377987

0.01 rupees per shitpost

>> No.71378011

and Inztel shills bring that up, despite all their OTHER vulnurabilities? ;^)

>> No.71378012

lmao, one that only exists in a virtual environment that 99% of people aren't ever in?

What the fuck are you dumb?

>> No.71378013

And it's not even recommended by Apple because the vulnerability is so difficult to exploit. There will NEVER be a real world attack based on MDS in the wild unless you had a billion Pajeets creating hundreds of lines of code specifically for your exact combination of hardware and running applications. You have a higher chance of getting hit by a comet than getting hit by an MDS-related attack.

>> No.71378019

the hyper threading vulnerability only exists in virtual environments, you'd have to be retarded to enable it for EVERYONE.

>> No.71378027

Uhh but it's actually more expensive than 9900k


>> No.71378038

Since when is Jensen a fucking tranny?!

>> No.71378054

by a lot too.

I can get a 9900k for $480 at my local microcenter.

I wonder what they'll have the 3950X for.

>> No.71378071

intel lost on cs:go
that gunna hurt

>> No.71378075
File: 44 KB, 480x480, 121_PZGZla5.jpg

congrats, you passed the IQ test
only brainlets think "ayyyyyyymd" is the budget option

meanwhile intel has better offerings at ANY price level

>> No.71378110

begone shill

>> No.71378111

Are you stupid? Every big streamer uses 2 PCs. It has nothing to do with performance; they stream off a second PC so nothing can ever interrupt their stream. If something goes wrong on their gaming PC (game crash, Windows shits the bed, mandatory reboot, etc.), they will still have their stream live. Viewers are fickle retards, and if your stream ends for any reason at all, even if you say "I AM GOING TO REBOOT", you will lose 50% of your viewers for the rest of the day. The second a stream says Offline, half your viewers are closing the tab.

>> No.71378114


>> No.71378154

Intel is a piece of shit company with backdoors built into their architecture in conjunction with the NSA. NSA clearance was required on the architecture team. The patches to address the security loopholes and their detriment to performance warrant a lawsuit. There is no justifying this behavior, you absolute retards.

>> No.71378174


Performance has nothing to do with it. If you want a specific example, look at Shroud: he uses a 12-core processor fully capable of streaming while gaming, and did so for years. He ended up buying a second streaming PC because at the time PUBG had a habit of hard locking your PC, and every time it did he'd come back to find 30% of his viewers left.

>> No.71378245

Lisa Su anime when?

>> No.71378253

Probably yeah. He should probably get a better job than shilling broken, overpriced processors.

MDS is the one I've read about, I know there were others last year or something but I never bothered to read about them at the time.


Google even went so far as to disable HT in Chrome OS, so if you bought an expensive Chromebook Pixel, then sorry, your performance is now cut by up to 40%. You can apparently re-enable HT in the flags, but the fact that Google have disabled it by default shows you how concerned they are by it.

Yeah, Intel came out saying "no you don't need to disable HT!" because they care more about profit than they do about their customers' safety. You DO need to disable HT if you want full mitigation. You can decide to take the risk and not disable HT, but you WILL remain vulnerable, NO MATTER WHAT ENVIRONMENT YOU ARE IN.

>all of this phenomenal cope
Apple DOES recommend disabling HT for "customers with computers at heightened risk or who run untrusted software on their Mac".

To be honest, that's anyone. Macs have Gatekeeper, which doesn't let you install any "untrusted" apps unless you go into System Preferences to allow that specific installation. "Untrusted" means anything downloaded from outside the App Store. So really this will apply to millions, probably most Mac users.

>There will NEVER be a real world attack based on MDS
This is exactly the same complacency that led to AMD coming along and making more compelling products.

No, it exists in EVERY environment as far as I'm aware - it's just a risk analysis thing. ALL Intel CPUs with HT produced for many years have this vulnerability. It is just a case of deciding whether your risk is high enough that you can accept the big hit in performance.
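If you want to know where a given Linux box actually stands instead of arguing about it, the kernel publishes its own MDS assessment. A minimal sketch (the sysfs path is the standard one; the exact wording of the status string varies by kernel version and hardware):

```python
from pathlib import Path

# Ask the kernel for its MDS verdict rather than guessing. Typical outputs
# look like "Mitigation: Clear CPU buffers; SMT vulnerable" on affected
# Intel parts, or "Not affected" on others.
p = Path("/sys/devices/system/cpu/vulnerabilities/mds")
status = p.read_text().strip() if p.exists() else "unknown (sysfs entry not present)"
print(status)
```

The same directory holds entries for the other speculative-execution issues (spectre_v1, spectre_v2, l1tf, ...), so one loop over it gives the full picture for a machine.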

>> No.71378280
File: 57 KB, 645x729, brainlet2.jpg

>the top 10 streamers on twitch use a dedicated second PC for streaming, therefore EVERYONE who wants to stream should TOTALLY buy a second PC, instead of just one PC that is capable of streaming by itself

>> No.71378337

Yes, that's what I thought. Isn't 10000 kb/s a bit of overkill? With 6 or 8 Mb/s Intel would probably perform much better.

>> No.71378360

Good thing I don't play shit that locks my computer to that degree.

>> No.71378412

Actually I will correct myself - it looks like Apple doesn't use the word "recommend", whether it's for keeping HT or disabling it, in their security pages. So they are just giving the customer information, but not making an explicit recommendation one way or the other.

But the fact remains that they still suggest disabling HT for customers who run untrusted software on their Mac and want full mitigation:
>Although there are no known exploits affecting customers at the time of this writing, customers with computers at heightened risk or who run untrusted software on their Mac can optionally enable full mitigation to prevent harmful apps from exploiting these vulnerabilities. Full mitigation requires using the Terminal app to enable an additional CPU instruction and disable hyper-threading processing technology. This capability is available for macOS Mojave, High Sierra, and Sierra in the latest security updates and may reduce performance by up to 40 percent, with the most impact on intensive computing tasks that are highly multithreaded. Learn how to enable full mitigation.

>> No.71378441

Bend the knee and admit it

I'm always right

>> No.71378443

>there is no significant hit to gaming performance with the patches enabled anyway
So you're saying there's 0 difference with ""hyperthreading"" disabled?

>> No.71378625

They specifically mention disabling HT as a suggestion for customers running "untrusted software", which in the Apple world literally means ANYTHING that wasn't downloaded through the App Store. So Chrome and Firefox are untrusted according to Apple, because they're not in the App Store. And besides, even if you as a person trust Chrome, you might go to a website with JavaScript that could potentially exploit MDS.

So no, you're not right, and you're baselessly claiming that something will "never" happen when it is impossible to make such a guarantee.

>> No.71378646

Apple says there's a huge impact to performance with HT disabled, up to 40%:
>Full mitigation requires using the Terminal app to enable an additional CPU instruction and disable hyper-threading processing technology. This capability... may reduce performance by up to 40 percent, with the most impact on intensive computing tasks that are highly multithreaded.

>> No.71378686

He looks so confused

>> No.71378694

if you're a hotshot streamer that can afford a top of the line cpu, you can afford a second pc, it's better not to choke the system when you can offload a lot of the work instead

>> No.71378697

Why is that asian man wearing high heels? Why does he have the slight body figure of a large female?
Is .... is that Weird Al in disguise as Michael Jackson in disguise as an Asian with a male head on a female body?

>> No.71378714

I would be confused too if I had to play with that fucking Monitor set up

>> No.71379005
File: 59 KB, 493x493, megatasking.jpg


>> No.71379015

>making your life more complicated for minimal gains in stability

>> No.71379025

>putting zero effort into your 6-7 figures source of income

>> No.71379040


>> No.71379083

holy shit, dude
i never imagined we'd be able to use presets like that for real-time high fps streams this soon

>> No.71379159


>> No.71379166

Explain to me why this is great smart anon?

>> No.71379198

because i've come to expect <10fps for that kind of encoding
while i don't care about streaming, i do care about encoding stuff for storage, and of course i use software encoders for that despite the speed penalty

>> No.71379220

It's good for streaming as it allows better-looking video at the same bandwidth. This means your audience will see a sharper picture than the audiences of streamers doing lower quality encoding, especially streamers that use GPU encoding.
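Some quick arithmetic shows why encoder efficiency matters so much here: at streaming bitrates, every pixel gets a tiny bit budget, so a more efficient encoder directly buys visible quality. The numbers below are illustrative assumptions (1080p60 at the 6 Mbps cap commonly cited for Twitch), not anyone's benchmark settings:

```python
# Back-of-the-envelope: bits available per pixel at typical streaming settings.
# 6 Mbps and 1080p60 are illustrative assumptions.
bitrate_bps = 6_000_000
width, height, fps = 1920, 1080, 60

bits_per_pixel = bitrate_bps / (width * height * fps)
print(round(bits_per_pixel, 3))  # prints 0.048, i.e. heavy compression
```

At roughly 0.05 bits per pixel the encoder is throwing away almost everything, which is exactly the regime where a slower, smarter encoder (x264 slow) visibly beats a fast hardware one at the same bitrate.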

>> No.71379228

Not sure what the big deal is; I can stream PUBG in 1080p with my 4790K and maintain 70+ FPS in game.

>> No.71379238

>Asian women make bad drivers

>> No.71379263
File: 101 KB, 685x794, intel shill 3.jpg


>> No.71379280

it's exactly how I mega-task.

btw I don't know why, but Windows seems to slow down when I MEGA-TASK, jokes aside. I had this on my old 4700HQ and thought it was the laptop, cooling, whatever. but now it's the same with my 4790K. usually happens when I have like 30 apps open or so, and yes, I do actively do work in them all.

for some reason new windows start to open slower, re-opening old ones is slower... all that stuff, and it annoys the fuck out of me. CPU usage is like 10-50% (VM is idle, IDE is idle, I just run several apps, but nothing taxing).

what can fix this?
or is Windows desktop just shit at MEGA TASKING?
(gotta use my new fav meme word)
if I buy a 3900X with a decent amount of RAM (16-24 GB) and decent speed... will this stop happening?

>> No.71379290

AMD did really well showing off their Cinebench and game FPS scores legitimately but this benchmark was crazy disingenuous.

Division runs fine on 4 cores; you can affinity lock the game to 4 cores and OBS to the other 4 and stream pretty much flawlessly. All they did here was intentionally set OBS to encode on all cores to gimp the performance well under what anyone would get streaming properly. You can bet your ass they set affinities for their side of the test, because even at 16 cores you'd lose way more performance than they showed if you let OBS run x264 across all cores.

>> No.71379314

Ok this is kinda shit since the only version worth using is LTSC, which didn't get this update.

>> No.71379315

Okay hold on now. I've done multiple side by side comparisons between CPU and GPU encoding, at low bitrates and high, and I can't detect even the slightest difference in quality.

But hey, what do I know? I only make videos for a living.

>> No.71379320

Yeah, anyone who knows a bit about streaming can see that.

Not sure why AMD pulls this shit; even if they perform worse than Intel in some respects, if they didn't pull it they'd at least have the moral high ground.

Now they just look dumb.

>> No.71379323

You'd be wiping out half of Israel.

Please delete this VERY anti-semitic comment immediately.

>> No.71379324

>doubling down on the MOAR CORES meme so hard that you have to show MEGA-TASKING benchmarks with a specific multithreaded game and software encoded streaming, when this isn't relevant to 99% of users or even the streamers themselves since they would rather use a dedicated streaming pc to offload work from the gaming pc

>> No.71379387

I'm with >>71378280 on this.

But, >>71378694 is also right.

If you're a small streamer (ie not making a living off it) then it's simply a waste of money to build an entire second computer for that extremely rare occasion where your main PC locks up.

If you're a big streamer, spending another $500 or so on a second computer dedicated to running the stream is a minor expense and makes a lot of sense to do.

It makes sense for some people, doesn't make sense for others. Just like everything in life (except sunscreen).

>> No.71379440

>gpu-bound bar on the left
How can people do this? The idea of deliberately engineering cases like these makes me sick

>> No.71379631

Fall Guys
Luigi's Mansion 3
Collection of Mana
Way to the woods
Panzer Dragoon

>> No.71379724
File: 239 KB, 1999x1023, 1560209862776.jpg

Here you go, faggot, 1080p pleb gayming

>> No.71379734

> intel is higher on several charts
> number is still near AMD like its better

>> No.71379752 [DELETED] 

>amd is higher on several charts
>number is still near intel like its better

>> No.71379831

2 PC streaming setups definitely have a major advantage in that the main system is not using any resources for video game capture, OBS rendering/compositing and video encoding, but they also come with their own share of issues. The setup is much more complex in general (audio routing too), and one of its major difficulties is the resolution and refresh rate your gaming PC's monitor is running at. You generally need to ensure that it remains compatible not just with your monitor, but with your capture card as well. Good enough capture cards generally exist nowadays, but playing on a 1440p 144Hz+ monitor and attempting to capture that isn't necessarily something which will always work flawlessly with any capture card out there. Cloning the display via Windows doesn't always play nice with different resolutions and refresh rates either. It's not as easy as it seems to get a stutter, tearing and judder-free stream out at 1080p 60FPS or something if you're using a 1440p 165Hz monitor for instance (and let's not bother with 4K or 4K 120Hz+ monitors). A separate capture PC solves some issues but also introduces quite a few points of potential technical difficulty as well.

The single-PC setup is comparatively much simpler and much easier to deal with in a lot of cases, so if you've got a PC fast enough to do it all without suffering performance issues that may still be the preferred solution. It also lets you get monitor upgrades without constantly worrying about requiring a capture card upgrade to match.

>> No.71379853

Anon, are you retarded? The numbers are on top of AMD bars because they're showing AMD performance.

>> No.71379869

>unironically using charts that look like they're straight out of a /g/ meme
What? That's a perfectly reasonable scale for the axis.

>> No.71380079
File: 162 KB, 633x900, intel shill 4.png

It's really as simple as this:
>equivalent single-threaded performance
>vastly superior multi-threaded performance
>half the price

>> No.71380124


>> No.71380140

>still inferior in IPC
>still can't clock for shit
>needs a massive cooler for muh XFR/PBO
>needs magical 4000MHz+ RAM
>needs a $500 mobo if you don't want it to blow up
>m-muh multitasking
9400f is still the value king, cope harder AMDrone

>> No.71380175

>still can't clock for shit
still breaks LN2 benchmark records

>> No.71380216

still no 5ghz

>> No.71380253

have we arrived back at Athlon 64 times where AMD needs to use 5000+ style model numbers because of inteltards

>> No.71380278

BO3? When's this from? 2 years ago?

>> No.71380527

i mean it's unnecessary to plot the max frequency like that. almost like the meme with the nvidia cards where the 980, 1080 and 2080 have bars according to their model number. frequency isn't the only thing that matters. intel wouldn't make a diagram like that.

>> No.71380650


>> No.71380774
File: 224 KB, 570x612, intel shill 6.png

>Best All-Round Value CPU
>AMD Ryzen 5 2600
>The true Intel alternative to the Ryzen 2600 right now is the Core i5-9400F.
>Personally we'd still go with the 2600X since you can overclock it and the AM4 platform offers a significantly better upgrade path

>Ryzen 3600 will get you at least 15% more performance at the same frequencies.

>> No.71380884


>> No.71381001

based and redpilled

acidic and bluepilled

>> No.71381127

You'd have to be retarded to not stream with your GPU.

>> No.71381240

it's OK Jensen, you will definitely survive the APU apocalypse in 3-4 years. I believe in you.

>> No.71381246

>Not giving your GPU all the resources it needs for actual gameplay.
