
/sci/ - Science & Math



File: 460 KB, 1340x952, cpu.png
No.12018574

>> No.12018581

>>12018574
We might, but I highly doubt Moore's Law or hardware innovations will be the reason why. AI is primarily a theory/software problem.

>> No.12018611

>>12018581
Is it really though? Wouldn't enough data processing through better algorithms give AI the conversational ability and knowledge to replace humans?
AI is already becoming pretty scary from what I have tested. It's already capable of writing an entire storybook from a few sentences you give it to set the theme and details of the story.

Sometimes I feel like this whole diversity-in-STEM push and the ruining of prestigious colleges is an elaborate conspiracy by people afraid of AI,
trying to slow technological progress while they figure out how to keep humans relevant in an increasingly non-human world.

>> No.12018618

>>12018581
You could probably write something on a piece of paper that would transform you into God, but finding it is the hard part. When you have exaflops, you can search far more efficiently than with teraflops.

>> No.12018661

>>12018611
>is it really though
Yes

>> No.12018795

>even if the progress will completely stop now, you still can adjust ai dungeon into a robocatgirl
Life is great.

>> No.12018998

>>12018611 Actually, if you look at Moore's Law as one instance of a longer trend in data-processing speed, the exponential curve goes back thousands of years: number systems, zero, the abacus, printing, logarithms, transistor density (Moore's Law), the internet, and so on. The rate at which we talk to each other and learn from each other keeps increasing exponentially, and will continue until we are no longer capable of keeping up (population crash?).

>> No.12019089

>AI singularity

It's still debatable whether hard AI can ever be achieved. No one knows if a machine can become sentient; we still barely understand consciousness.

>> No.12019189

>>12018574
I maintain that the next step in humanity's evolution is to integrate our minds with AI via a mind-machine interface.

AI can't exterminate humanity if we are the AI.

>> No.12019202

>>12019089
Sentience is just high order executive function glued together with computational ability. Of course, creating an AI that's able to combine the two is much, much harder than it sounds.

>> No.12019299

>>12018611
>AI is already capable of writing an entire story book based on the sentences you give it for context on the details of the story theme.
Sure, but it's not gonna be good or make 100% sense. I'm all for AI being more widespread but it's gotta get better at creative thinking.

>> No.12019321

>>12018611
The most lucrative node size is 14nm. Most software can't even efficiently harness all the compute the hardware already gives it. The semiconductor industry could sit on its ass for another 10 years and let the software industry catch up, and it would be OK.

>> No.12019330

>>12019321
Yeah no. Deep learning is hardware constrained. Stop reading singularity roleplay blogs.

>> No.12019335

>>12018581
AI is a hardware problem, but the problem is not stuffing more wires in a smaller box.

>> No.12019345

>>12019330
>deep learning is “most” software
>deep learning is efficient in its compute

Go reread my post

>> No.12019349

>>12018581
fpbp
the essence of AGI will probably be simple; what we need is a breakthrough, not more hardware to throw at the problem.

>> No.12019356
File: 158 KB, 406x395, I TRIED TO WARN YOU.png

>>12018574
>The techies' belief-system can best be explained as a religious phenomenon, to which we may give the name "Technianity." It's true that Technianity at this point is not strictly speaking a religion, because it has not yet developed anything resembling a uniform body of doctrine; the techies' beliefs are widely varied. In this respect Technianity probably resembles the inceptive stages of many other religions. Nevertheless, Technianity already has the earmarks of an apocalyptic and millenarian cult: In most versions it anticipates a cataclysmic event, the Singularity, which is the point at which technological progress is supposed to become so rapid as to resemble an explosion. This is analogous to the Judgment Day of Christian mythology or the Revolution of Marxist mythology. The cataclysmic event is supposed to be followed by the arrival of techno-utopia (analogous to the Kingdom of God or the Worker's Paradise). Technianity has a favored minority, the Elect, consisting of the techies (equivalent to the True Believers of Christianity or the Proletariat of the Marxists). The Elect of Technianity, like that of Christianity, is destined to Eternal Life, though this element is missing from Marxism.

>Historically, millenarian cults have tended to emerge at "times of great social change or crisis." This suggests that the techies' beliefs reflect not a genuine confidence in technology, but rather their own anxieties about the future of the technological society, anxieties from which they try to escape by creating a quasi-religious myth.

>> No.12019361
File: 250 KB, 1920x1080, compoootroniumer.jpg

>> No.12019426
File: 803 KB, 624x762, star.png

>>12018574
Kind of weird, isn't it? Out of all the humans that were and will be born, why was I so lucky to be born right now?

>> No.12019607
File: 629 KB, 2656x1758, tanzania-overlanding-land-rover-3.jpg

https://arxiv.org/abs/2007.05558

>The Computational Limits of Deep Learning

>Deep learning's recent history has been one of achievement: from triumphing over humans in the game of Go to world-leading performance in image recognition, voice recognition, translation, and other tasks. But this progress has come with a voracious appetite for computing power. This article reports on the computational demands of Deep Learning applications in five prominent application areas and shows that progress in all five is strongly reliant on increases in computing power. Extrapolating forward this reliance reveals that progress along current lines is rapidly becoming economically, technically, and environmentally unsustainable. Thus, continued progress in these applications will require dramatically more computationally-efficient methods, which will either have to come from changes to deep learning or from moving to other machine learning methods.

>> No.12019612

>>12019607
faggot

>> No.12019720

>>12018574
Friendly reminder that AI is just a buzzword, because all we have in the end is the same old brute-forced ML algorithms. Instagram filters, video/audio editors, and a retarded car are what pop-sci calls AI.

>> No.12019726

Friendly
Reminder
Hard AI will not have a binary architecture

>> No.12020225

>>12019426
not THAT lucky, 10% chance

>> No.12020269

>>12019726
It will have unicode architecture.

>> No.12020334

Yea we need a learning computer like the t-1000

>> No.12020622

>>12019349
>the essence of AGI will probably be easy
What horseshit. You have no idea, not an inkling of a clue.

>> No.12020636

>>12019720
Then why do they have to shut them down?
https://www.bitchute.com/video/5doYBvOjG67m/

>> No.12020739

>>12018581
Neural nets only became viable because gaymers and crypto autists drove demand for GPUs to a level it would never have reached on AI's potential alone. Hardware is everything for reinforcement learning, since the networks only learn through huge tweaks to the weights of the nodes.

>> No.12021782

>>12018611
Errrm, anyone with more IQ than a piece of bologna knows the stories you like to read that are written by GPT-3 are fucking retarded at best. Fuck you, pendejo.