
I find that buying about six months behind the state of the art gets you 90% of the performance, often for half the cost.

I've been custom-building my desktops since the '90s, and the last time I had an issue with hardware was, I think, a GeForce MX440, so around 2003.

The driver story for desktop hardware on Linux is (in my experience) absolutely great, provided the hardware has been out six months or more, or uses a core chipset/device that has.




I have given this advice on Hacker News before: on eBay, 2-3-year-old workstations can be had for 300-400 euros. Typically these are workstations from companies that replace their machines every two years or so.

These are typically equipped with Xeon CPUs, plenty of memory, sometimes ECC, etc. Moreover, since they are usually HP/Dell workstations, they are certified compatible with Red Hat Enterprise Linux, so all the hardware is pretty much guaranteed to work.

Just to give one random example:

http://www.ebay.de/itm/DELL-Precision-T1650-CPU-Intel-Xeon-E...

Xeon CPU, 32 GB RAM, 500 GB SSD for 425 euro.


I love buying old workstations; however, power costs have to be considered.

For example my 2x Xeon 2670 (16 cores/32 threads) is quite the power hog.

Your example is pretty good, since v2 Xeons are Ivy Bridge level and consume noticeably less power than the v1 equivalents.

However, that particular CPU is basically an i5, not an i7 (just 4 cores/4 threads).


This is very cool. I wonder what the electricity would cost if I left that machine on 24/7 for a year. Power in Germany is a bit expensive (learned the hard way).
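The arithmetic is quick to sketch; the 100 W average draw and 0.30 EUR/kWh rate below are assumptions for illustration, not figures from the thread:

```python
# Back-of-the-envelope: yearly electricity cost of a machine left on 24/7.
# Assumed figures (not from the thread): 100 W average draw,
# 0.30 EUR/kWh as a rough German household rate.
avg_watts = 100
eur_per_kwh = 0.30
hours_per_year = 24 * 365

kwh_per_year = avg_watts / 1000 * hours_per_year   # 876 kWh
cost_eur = kwh_per_year * eur_per_kwh              # ~263 EUR

print(f"{kwh_per_year:.0f} kWh/year -> about {cost_eur:.0f} EUR/year")
```

At a dual-Xeon's load draw (easily 250 W or more under sustained use) the cost scales linearly, which is why the power-hog concern matters.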


Unless you want your PC to sleep and wake back up. Apparently that still takes a while to get sorted out with every GPU.


Such a simple thing (as far as a user is concerned), yet also a complete showstopper when it doesn't work right.


ACPI is a busted-ass standard. OEMs are free to do pretty much whatever they want, and they do. Again, the only relevant standard from an OEM's perspective is "does it work on Windows".


Last year I bought an i5-4690K, a slightly outdated but top-of-the-line motherboard, 32 GB of DDR3-2400, and a bunch of 1 TB WD Blacks... The total price was about 800 USD, yet it feels like I bought something much more expensive. Everything I throw at the machine runs fast; even the HDDs are fast enough that I don't feel the need for an SSD.

My only mistake was buying an AMD 380x for the GPU.


What did you regret about the 380x?

I ask because I've been thinking about making my next card an AMD one; I'm tired of dealing with proprietary drivers on my current nVidia card...


I had used only nVidia my whole life, and got tired of some of their business-related bullshit.

So I thought AMD was going to be better, because they try to be good and nice...

Well, AMD hardware is NOT as good as nVidia's. For example, the 380x in particular is really fast but EXTREMELY power hungry, so much so that AMD had to cripple it severely: sometimes it starts stuttering heavily in games before it even gets hot, and when I look at the logs, the reason is that it hit its power usage limits.

And they are bad at marketing, while also doing the same bad marketing things nVidia does. For example, AMD shills do exist: I got banned from chat rooms after asking how to fix bugs (because they want to give the impression their drivers are bugless... but they are complete crap too; even their Open Source driver for Linux is so much crap it was entirely rejected by the kernel team). They deny their cards have physical bugs (the RX 480 has the same issues as the 380x, but ALSO has unbalanced power usage, drawing too much power from the mobo and damaging it), and so on...

I tried asking for help with my card's issues through both AMD's and Sapphire's (the manufacturer's) official and unofficial channels, and I was treated very badly: people would ignore tickets or give me nonsense information, and several times they told me to just return the card and buy another one. I can't do that, because I purchased my computer in the US but live in Brazil; if I could, I would have swapped my 380x for an nVidia GeForce 970, which was the same price in several countries back when I bought the 380x.

Also, AMD drivers don't crash the OS the way nVidia's do, but the drivers themselves crash a lot more, on all OSes: AMD drivers restarting (and taking your game/software with them) is fairly common, along with weird errors (the updater crashing, the control panel crashing, etc.).


> their Open Source driver for Linux is so much crap it was entirely rejected by the kernel team

No, amdgpu was not rejected by the kernel team. A particular implementation of the driver was rejected because it implemented an abstraction layer, which would have made it nearly impossible for kernel devs to maintain.

> drawing too much power from the mobo and damaging it

If you could point to an example of this happening, I'd appreciate it. My knowledge of the situation is that some models of the RX 480 can run slightly out of spec, pulling a little too much power from the motherboard. Any motherboard I've heard of could withstand that. And if you really care you can enable an option in the driver that causes it to run strictly in PCI spec.

I'm not an AMD shill, I just think you've misrepresented some of the issues at hand. AMD make mistakes, for sure. But not every mistake is as crippling as you've implied.


By "slightly" out of spec you mean pulling 7.7 amperes through a part rated for 5.5 amperes, which might (due to dust and other factors) push all 7.7 amperes through pins that are supposed to carry only 1.1 amperes each.
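In watts, taking the 12 V rail (which is what the slot's ampere allowance refers to), those numbers work out like this; a quick sketch:

```python
# The 12 V rail of a PCIe x16 slot vs. the draw reported above.
volts = 12.0
rated_amps = 5.5      # slot allowance on the 12 V rail (66 W)
measured_amps = 7.7   # figure reported in the comment above

rated_watts = volts * rated_amps        # 66.0 W
actual_watts = volts * measured_amps    # 92.4 W
overdraw = (measured_amps / rated_amps - 1) * 100

print(f"{actual_watts:.1f} W drawn vs {rated_watts:.1f} W rated ({overdraw:.0f}% over)")
```

A sustained 40% overdraw through a connector is a very different thing from "slightly out of spec".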

Just look in AMD's own official forums for threads created before they shipped the driver patches. In one thread, for example, a guy posted a photo of his molten mining rig, followed by several pages of people calling him an nVidia shill, with no one helping.

The handling of those incidents was so bad that I stopped visiting the AMD forums entirely; it was pure hostility toward anyone with any problem, even unrelated ones.


I couldn't find the post you were talking about (I was really hoping you could provide a link). Instead, I found a thread filled with people talking about how you have to be careful with your mining rigs because "Any electric appliance can catch fire." [1]

Bitcoin mining isn't a great example; if you look further into that thread, it's not just AMD users whose rigs have caught fire in the bitcoin mining situation.

1: https://bitcointalk.org/index.php?topic=1776853.0


For some reason I usually get downvoted when I bring this up, but I still believe nVidia is a better experience on Linux than AMD.

Like you, I always ran NVidia because of their support for Linux, but recently tried to use an AMD card for a Linux build. I ended up buying an NVidia instead, and all my problems have gone away.


For me, the big problems with NVIDIA drivers started showing up when I moved to a rolling release distro (OpenSUSE Tumbleweed). Proprietary drivers don't like frequent kernel upgrades.


The 380x and 390x are HOT. Newer Radeons are much better in that regard, and generally very decent cards - 470/480 and the recent rebrands/reclocks 570/580.


My 380x only gets hot when using the default fan control, which is complete crap.

With my custom fan curve, it starts getting performance-limited while still around 60°C (and the fan noise is still not perceptible over the sound of a game, for example), because it hits the power limits instead.
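A custom fan curve is just a temperature-to-duty-cycle mapping with interpolation between points; a minimal sketch in Python, with invented curve points (not the commenter's actual settings):

```python
# Minimal fan-curve lookup: linear interpolation between (temp C, fan %) points.
# The points below are invented for illustration, not a recommended profile.
CURVE = [(30, 20), (50, 35), (60, 55), (75, 80), (85, 100)]

def fan_percent(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two neighbouring points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(55))  # halfway between the 50 C and 60 C points -> 45.0
```

Tools like fancontrol on Linux or vendor GPU utilities apply essentially this kind of mapping.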

The 380x has the same power limit as the 380, despite having more GPU power available and double the RAM; I have no idea why they made such a crappy decision.

Its power problems are so severe that undervolting the card makes it MORE stable and faster, because doing so reduces total power usage and triggers the power limits less often. (The same applies to the 480, by the way: during the "PCI slot melts" crisis, people found that undervolting made it behave much better.)
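That undervolting result matches the usual first-order model: dynamic power scales roughly with the square of voltage at a fixed clock, so a small voltage cut buys a disproportionate power reduction. A sketch, with invented voltages:

```python
# Dynamic power is roughly P = C * V^2 * f, so at fixed clocks a small
# undervolt cuts power quadratically. The voltages below are illustrative only.
def relative_power(v_new, v_old):
    """Power ratio after a voltage change, clock frequency held constant."""
    return (v_new / v_old) ** 2

ratio = relative_power(1.075, 1.150)  # a ~75 mV undervolt
print(f"~{ratio:.0%} of stock power, so the power limit trips "
      "correspondingly less often")
```

A ~6.5% voltage cut drops dynamic power by roughly 13%, which is real headroom on a card that throttles on its power limit.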


I own a 380x but it's in the corner gathering dust because of buggy drivers and crashes. It doesn't keep running long enough to get hot in my case.


When WiFi finally got to the point where 90% of it just worked, it was huge. 2009?


Bluetooth can still be a PITA though...


It has gotten really good in the last 5 years, though I've only been using Bluetooth on OpenSUSE.


I still hit bugs on Arch, and I know they can hit Ubuntu as well[0]. For instance, if anyone can connect to an Amazon Echo as an audio device, I'd like to know what version of bluez and/or pulseaudio they are using, as it stopped working for me after an update a while ago...

[EDIT] Though to be fair, I think they did call out bluetooth as one of the things Ubuntu was going to focus on in their next release IIRC.

[0] https://askubuntu.com/questions/871630/cant-send-audio-to-am...


I don't think WiFi is that common in desktop hardware.


In business, not really; the majority of home users are on WiFi, though.


OK, I would assume this is something that differs between countries. Here in Scandinavia I've met extremely few people running their desktops exclusively on WiFi, since it's generally unreliable when lots of devices talk to the same AP/WiFi router in a noisy environment with many other networks occupying the same frequencies. Actually, I would say running Powerline to desktops is more of a standard approach here.


Uhh... hi, I'm a Scandinavian (a Swede). I have a desktop exclusively on WiFi, and I've never seen an office or a home actually use Powerline (networking over the electrical wiring). Oh well, one anecdote against another :-)


Swede here too, trevligt att råkas (nice to meet you) :)

Powerline networks are mainly used by people living in concrete multi-story buildings who either don't want to pay the higher cost of installing a proper Ethernet backbone to every room, or don't want to gamble on WiFi because of thick walls or other factors that can seriously affect connectivity, as mentioned before.

I used to run my own IT support company with several employees (think Geek Squad), serving both enterprise customers as well as the private sector, and people running Powerline are actually a lot more common than you might think. My parents, for instance, run it alongside WiFi in their house: the desktop, IPTV, and camera surveillance are on Powerline, while tablets and phones are on WiFi.

During all my years in IT I can remember only two situations where WiFi was used on a desktop machine instead of Ethernet or Powerline. One was a car dealership with a salesman sitting in a "glass box", in a building shared with another company (so, no Powerline); the other was an enthusiast who built himself a new computer whose new Asus motherboard came with 802.11n built in.

Personally, I run all my desktop machines on the 10 Gbit CAT6a network I have at home (I have WiFi as well, but it's on another VLAN with no access to the network infrastructure, mainly used by the kids). It lets me stream multiple 4K streams from my FreeNAS server while downloading huge files off the Internet without breaking a sweat; try that over a WiFi connection and you'll hit a brick wall pretty fast.
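The bandwidth arithmetic backs that up. Assuming ~25 Mbit/s per 4K stream (a typical streaming bitrate; high-bitrate local rips can be several times that) and round link capacities:

```python
# Rough count of concurrent ~25 Mbit/s 4K streams per link (ignores overhead).
STREAM_MBPS = 25  # assumed per-stream bitrate

links_mbps = {
    "real-world 802.11n WiFi": 100,
    "Gigabit Ethernet": 1_000,
    "10 GbE over CAT6a": 10_000,
}

for name, capacity in links_mbps.items():
    print(f"{name}: ~{capacity // STREAM_MBPS} streams")
```

A busy WiFi link saturates after a handful of streams plus a download, while the wired links barely notice.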


I'm in the US. I'm told houses are bigger and spaced further apart here, so perhaps that is why it's more common here.


The reason wifi is more common in the US has to do with cost.

Most people, when buying a new house, can't stomach the cost of multi-line pulls from each room to a central wiring closet (not sure why, because it can't be a large portion of the overall cost of a new house). Plus, you have to put that central closet (or panel, at a minimum) somewhere out of the way, and most people just don't get that kind of tech (the idea of a central area for a home server, networking gear, etc.).

So the lines aren't installed (at one time houses were offered with the option, and if you're willing to pay today you can still get it, but most people don't). After-the-fact retrofits aren't done because such an install is very difficult (especially in modern houses with horizontal firebreaks between the verticals, little to no attic, vaulted ceilings, etc.), which also means it's expensive.

So instead, people go with WiFi. It's cheap, needs no dedicated wiring/network termination panel or closet, and can be taken down and brought with you if/when you move.

Personally, I prefer a wired system; when I moved into my house I installed a few drops myself where I knew there'd be some dedicated hardware (TV area, my office, library, and my shop); the other rooms I never installed anything because it didn't matter. For those, the wifi I have fills in those blanks adequately. I ran all the lines back to a custom wiring closet I built in my shop, and terminate everything there (plus a few of my servers live there too).


This sounds plausible to me. Here in Omaha, when I lived in an apartment with 30+ APs visible, I had to be careful to pick the frequency based on what worked and what didn't, and even when I got it reliable I had short range. I'm pretty sure this was just noise and cross-talk.

Now in a house I see maybe 10 APs and they are all at the edge of their range and I rarely need to tinker with it and it works all the way across the street.


On a laptop, sure. But on a desktop? I've never met anyone using anything other than plain old RJ45.


Happens all the time. Lots of people in apartments/houses don't want an RJ45 cable running through the hallway, nor do they want to pay for RJ45 wiring (particularly when renting).


Interestingly, in the last 10 years I've not met anyone who bothers with RJ45 in the home. Everyone in the UK gets a free wifi router with their broadband, and tends to just use that.

Not saying you are wrong, just different areas are different.


My parents live in a 200-year-old house with Cat6 ethernet in the walls. They needed to replace the electrical wiring, and decided to get it installed at the same time. I don't think they've regretted it, especially as thick walls attenuate the wifi signal. I'm pretty sure this is unusual, though.

More broadly, I think home desktops are getting rarer in the UK. Wifi is the obvious answer for portable devices with wifi capabilities built in, even in dense housing with lots of devices interfering with each other.


That definitely is unusual, but when you're doing a whole-house electrical refurb (given what little you mention, it sounds like a knob-and-tube switchout, right?), you likely have everything torn up to hell and back, so you might as well fix or add anything else behind the walls while you can.


People do. One of my friends asked for a "USB to USB" cable last week (you have how many phones, but no USB cables?). It turns out he wanted to connect the USB Type-A port on a WD My Cloud (a NAS) directly to his desktop, because when he connected the drive via Ethernet to the router and transferred over WiFi from his basement desktop, the estimated time was two weeks.


I would say the vast majority of people use wifi for desktop. (Not myself personally except on the third floor of my house). You might be surprised by the percentages.

Also, don't underestimate the power of stealing your neighbor's internet. I have a fun little router named "dontstealmyinternet". I kept the router's default password but have it blocked; it gets about 5 connection attempts a month from new machines.


My < 6 month old, $4500 USD desktop has 2 x 1 GbE plus 10/100 BMC, but I put an Intel 7265 PCIe card in it and use that instead.

I do use one of the Ethernet ports but it just goes to another router next to my desk (connected via crappy Powerline to the rest of my network) for testing w/ KVM.


My media PC is on wifi because I rent the place and couldn't be bothered to lift the carpets to run a cable right around the room.

It's some piece of crap TP-Link but it works 100% of the time so far.


Consumer machines come with WiFi pretty much as standard now; my desktop has WiFi, and it came that way from the manufacturer. Not everyone wants to rewire their house to wherever they want their computer.


Hello! Nice to meet you.



