
/g/ - Technology


File: 384 KB, 900x676, FAE506AC-4665-43BD-9CD3-3E2C94C38CDB.png
No.71683517

Newbie wanting to self-host a website, emails, file storage/backups and also general heavy computations (machine learning and such).
My current options are a Dell R210 II or an HP DL585 Gen7, both for the same price of 500 dollaroos.

Which option should I go with?

>> No.71683533

lol

>> No.71683638

>>71683517
>self-host a website
Is it just static content? You can literally do that for free on GitHub or any number of other places. Even hosting a site through an S3 bucket with a fucking CDN will cost you pennies a month.
>emails
Seriously, it’s not worth the effort and your ISP probably blocks port 25 anyway. Get a Fastmail subscription for like $45 per year.
>file storage/backups
How much storage are we talking here? It doesn’t take much horsepower to run a NAS and you could save your cash for hard drives and just get some low end mini-ITX board and case.
>also general heavy computations (machine learning and such).
A lot of these cheap server boards suck for that. First because the processors are nothing special, and second because a lot of them won’t boot with a PCIe board that draws more than the max power provided by a PCIe slot (in other words, a decent GPU, which you’ll want for machine learning).

The thing with these big rack mounted boxes is they look cool in the “hey I’ve got a server” sense but unless you REALLY need serious power or a fuckload of disks they suck for home use. The power draw is insane and they usually sound like a fucking jet engine. Most people would be better served getting a modest mini-ITX board, a NUC, whatever.

>> No.71683659

>>71683517

I'm going to get a PowerEdge T340 for a business I have as a client. They want to share files between Mexico, Colombia, Brazil and Austria. It has RAID support and we will add cloud storage for backup. The thing is, you shouldn't get a real server yet; you're better off with a virtual one or paid hosting. Get a normal PC, maybe mini-ITX to save space and still have good functionality. You don't need a server.

>> No.71683666

I run a website hosted on my home network. I have it passed through the firewall and it's easy for development. ISPs really don't give a shit if you punch a hole in port 80 as long as the traffic isn't huge.

>> No.71683903

>>71683638
Very fair points, thanks. Assuming I still wanted to run my own stuff as opposed to buying a service (just for hobbyist sake), what lower power/less overkill things would you recommend? Could something as simple as an older desktop run all of this stuff without a hitch?

>> No.71683953

>>71683517
>>71683903
Firstly, any computer can be used as a web server with the right hardware & software. Hell, people have run web servers using Contiki on a Commodore 64 with a NIC cartridge. People have even run DOS-based web servers over a 56k modem with SIOUX.

Depending on what you want to do, you can go with a Raspberry Pi to host static or even somewhat dynamic websites (as long as they're lean). Stronger computers (including laptops & desktops from the mid-2000s) work well enough as NAS machines. Personally, I once ran a shell server with LAMP on an Acer Aspire One netbook with only 1 GB of RAM & a 160 GB HDD... with Xubuntu running its full DE.
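To put numbers on "any computer can be a web server": here's a minimal sketch using nothing but Python's built-in http.server, which happily runs on a Pi or an old netbook. The docroot and page contents below are throwaway examples, not anything from the thread.

```python
# Minimal sketch: serve a static site with Python's stdlib alone.
# The docroot and index page are throwaway examples.
import functools
import http.server
import os
import tempfile
import threading
import urllib.request

# A throwaway docroot standing in for your site's files.
root = tempfile.mkdtemp()
with open(os.path.join(root, "index.html"), "w") as f:
    f.write("<h1>hello</h1>")

# Serve that directory on an OS-assigned localhost port.
handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=root)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Fetch the page back, as a browser would.
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
print(body.decode())  # <h1>hello</h1>
server.shutdown()
```

Obviously for anything public-facing you'd put nginx or similar in front, but the point stands: the hardware bar is on the floor.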

>> No.71683965

whenever I see somebody who "needs" something for "machine learning" it's always obvious they don't know fuck all

>> No.71683992

>>71683965
Eh, I’ve just done some data science courses here and there and was interested in having the capability to dick around with ML if I felt like pursuing it. Admittedly I’ve not done anything involving massive datasets, so the hardware needed is foreign to me.

>> No.71684260

>>71683517
Get a desktop and a decent GPU. You'll be doing machine learning on the GPU, not the CPU. Servers make sense if you're practicing for work, not so much if you're just doing this as a hobby.

>> No.71684274

>>71684260
>You'll be doing machine learning on the GPU, not the CPU
wtf am I reading?

>> No.71684296

>>71684274
https://www.analyticsvidhya.com/blog/2017/05/gpus-necessary-for-deep-learning/

>> No.71684320

>>71684296
>https://www.analyticsvidhya.com/blog/2017/05/gpus-necessary-for-deep-learning/
>vidhya
>>>/v/

>> No.71684387

>>71684320
If you think GPUs are only useful for gaymen, you'd best get back to /v/

>> No.71684399

>>71684387
>Taking video game enthusiast blogs as actual information
>Tells others to go to /v/
I kek'd

>> No.71684536

>>71684399
>literally no background in AI or data analytics
>insists he knows what he's talking about
/g/ has turned into /v/....
https://www.zdnet.com/article/how-the-gpu-became-the-heart-of-ai-and-machine-learning/

>> No.71684604

>>71684536
>Taking zdnet and other blogs as valid sources
You're just as retarded as /lgbt/ and /mlp/ now.

>> No.71684624

>>71684604
Plebeian

>> No.71684790
File: 34 KB, 403x394, 382.jpg

>>71684624
See pic

>> No.71684825
File: 90 KB, 500x501, 1549496726514.png

>>71684790

>> No.71684833

>>71684825
>Still responding
He took my fishing pole :^|

>> No.71684916

>>71683638
>hey guys I want to do X
>don't do X.
>use a literal botnet instead
Every fucking time
>>71683903
Unless you have a separate room or closet I don't suggest anything smaller than 2U because of noise.
If you want to run a few VMs I recommend the Dell R710 since it's 2U and has more expansion slots.
Keep in mind if you're adding GPUs you might have to mod a way to get PCIe power.
I haven't used a lot of HP gear, but if you plan on doing ML I'm assuming you'll add a few GPUs, so more slots is probably better.
Depending on where you live eBay is a great place to find cheap old server shit.
There's no need for the latest and greatest, like >>71683953 said.

>> No.71684975
File: 563 KB, 843x674, Screen Shot 2019-07-01 at 10.25.16 PM.png

>>71683517

Raspberry Pi.

https://www.youtube.com/watch?v=W2F8Wa65_B4

https://www.youtube.com/watch?v=vzojwG7OB7c

>> No.71684991

Rosewill or whoever has one of those coin-mining 4U chassis made for a bunch of GPUs, less than a hundred. Retrofit a Xeon w/ board from eBay, maybe $300+, then spend thousands on GPUs. Sounds shit tbh

>> No.71685035

>>71683517
>>71683638
Even an SBC covers most of your needs, plus low power consumption, low maintenance and very low noise.
You can make a NAS: https://www.cnx-software.com/2018/07/13/rockpro64-dual-sata-nas-enclosure/
And you can even attach a decent GPU for machine learning to these things today: https://youtu.be/oXmqlDJTL5o
An alternative would be a mini-ITX or a used laptop motherboard, or whatever, without spending more than $150

>> No.71685039

>rent cheap server with port 80 open
>redirect every request to cheap server to locally running web server that has port >1024 open

What's the flaw in this approach?
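The approach described above is basically a reverse proxy. A toy sketch of the idea, with both ends faked on localhost: one listener stands in for the cheap rented box on port 80, and it relays requests to a "home" server on a high port. Real setups would use nginx or an SSH reverse tunnel instead, and all hosts/ports here are made-up examples.

```python
# Toy reverse-proxy sketch: "rented box" relays GETs to the "home" server.
# Both run on localhost here purely for illustration.
import http.server
import threading
import urllib.request

class HomeHandler(http.server.BaseHTTPRequestHandler):
    """Stands in for the web server running at home on a >1024 port."""
    def do_GET(self):
        payload = b"served from home"
        self.send_response(200)
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep output quiet
        pass

def make_forwarder(upstream_port):
    class Forwarder(http.server.BaseHTTPRequestHandler):
        """Stands in for the cheap rented box: relays each GET upstream."""
        def do_GET(self):
            with urllib.request.urlopen(
                    f"http://127.0.0.1:{upstream_port}{self.path}") as resp:
                data = resp.read()
            self.send_response(200)
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)

        def log_message(self, *args):
            pass
    return Forwarder

def serve(handler):
    """Start a server on an OS-assigned localhost port; return the port."""
    srv = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv.server_address[1]

home_port = serve(HomeHandler)
proxy_port = serve(make_forwarder(home_port))

# A client hitting the "rented box" transparently gets the home content.
body = urllib.request.urlopen(f"http://127.0.0.1:{proxy_port}/").read()
print(body.decode())  # served from home
```

The main flaws in practice are the ones you'd expect: added latency, the rented box sees your plaintext traffic unless you keep TLS end-to-end, and your home uplink is still the bottleneck.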

>> No.71685060
File: 188 KB, 1341x653, jetson.png

>>71683517
>newbie

use a cloud provider's free tier to spin up your own web server and such
as an aside you'll learn azure/amazon or whatever provider you use
if you want to roll your own, and are just experimenting you don't need server-grade hardware to make an internal network with your own little DNS, DHCP, web and email servers
if you want to dabble in machine learning there is a single-board computer called the Jetson that is $100, has great software support from Nvidia and allows you to experiment with machine learning
don't get a server until you have a plan to completely fill it up, like which services and VMs you are going to run
remember that hardware is driven by software and if you don't obey this you will spend money and have shit laying around
nearly every server is too loud unless you are single

>> No.71685093

>>71685035
>GPU for machine learning
Any actual proof of this beyond individuals' blogs and YouTube vlogs?

>> No.71685368

>>71685093
I don't know exactly, since it isn't my business.
That said, the question is too broad to answer properly; there's a heap of dedicated hardware and frameworks for machine learning.
Even Rockchip has released an iteration of the RK3399 (the one that powers the Pine64 NAS), the RK3399Pro, focused on machine learning (enabling next-generation SBCs for the task)

Here more AI chips:
https://github.com/basicmi/AI-Chip/blob/master/README.md

So, it depends largely on what you want to do and how optimized the algorithms you'd use are.
If you can run GTA 5 on an SBC with a graphics card attached, it should be possible to do most machine learning related stuff.

>> No.71685414

>>71683659
You have to go back.

>> No.71685529
File: 136 KB, 1030x210, 3DCA329C-7F95-4C0E-B016-29F65B44879F.png

>>71683517
Just get a Syno bro

>> No.71685583

>>71683517

It is very obvious you don't know what you are doing, since you want to do ML on a CPU, so my advice is: don't.

Never self-host unless you really need it, and if you really needed it you wouldn't be asking this question.

>> No.71685617

>>71684274
most of the (((popular))) machine learning frameworks such as TensorFlow can be CUDA-accelerated.

>> No.71685634

>>71685039
There is nothing wrong with using a reverse proxy, just make sure that you don't lose HTTPS encryption after the cheap server and you're good.

>> No.71685646

>>71684274
CUDA basically.

>> No.71685655

>>71684604
you know you're wrong and you're just trolling at this point.

>> No.71686059

>>71683638
>it’s not worth the effort
don't listen to this fag

>> No.71686179
File: 85 KB, 940x587, r910-inside-web-do_8.jpg

>>71683517
>machine learning
oh do I have the chassis for you

>> No.71686228
File: 27 KB, 640x480, rsvl4500.jpg

>>71683517
does anyone have a cost-effective solution to attach a bunch of drives to my main server? I have 45 3.5" hard drives.

I currently have them in three cheap $120 4U cases that each hold 15, and run straight SAS cables to my main server's SAS cards. It's messy as fuck and takes up so much space. Is there something that can deliver SATA III (6.0 Gb/s) speeds with a proper backplane? I'm seeing some expensive solutions that hold 24-48ish drives in a 4U spot

>> No.71686236

>>71686059
>>it’s not worth the effort
>don't listen to this fag
don't listen to this neet

>> No.71686260

(1) get a linode instance for the web server and (2) buy a high-core count computational machine for home (for your deepfake/deepnude computations).

>> No.71686275

>>71683517
>self-host a website

fucking why

>> No.71686785
File: 2 KB, 113x125, 1534537363164.jpg

>>71686236
