
Computer parts upgrade


datsun40146


Keep in mind that $380 was spent on core components only. There's also the hard drive, CD/DVD drive, case, PSU, etc.

 

$380 on those parts is a decent deal though; aux didn't overspend. Though I just pray that motherboard lasts more than a year... It's only taken me a couple of experiments with low-buck motherboards to know not to attempt it. I'll just stick to the very few brands I trust.

I've had 2 ECS motherboards last 4 and 6 years, respectively. One is still running in Capt Furious's PC, where it went in after his PSU burned out his old motherboard; the other is powering my arcade PC.


The only real problem I've had with my ECS board is that it keeps blowing capacitors. Not an insurmountable problem if you have a supply of the right type available (old motherboards = lots of caps) and can handle moderately fine (de)soldering work.


Well, I know that there are some cheap motherboards that work fine.

I have a really cheap MSI motherboard that came bundled with an Athlon 64 3000+ (Socket 754 - haha, the last Athlon 64 3000+ Newegg had).

I've swapped my CPU out for my mom's 3200+ and have it overclocked from 2.2 to 2.53GHz totally stable, but I can do a suicide run at a touch over 2.6GHz.
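
For anyone wondering where numbers like that come from: on these Athlon 64s the core clock is just the HTT reference clock times the CPU multiplier, so here's a quick sketch of the math (the 11x multiplier below is an assumption for illustration, not something stated above):

/* Quick sketch of the Athlon 64 overclock math above.
   The 11x multiplier is an assumption for illustration, not a figure
   quoted in this thread; core clock = HTT reference clock x multiplier. */
#include <stdio.h>

int main(void)
{
    int multiplier = 11;   /* assumed stock multiplier */
    int stock_htt  = 200;  /* MHz reference clock at stock */
    int oc_htt     = 230;  /* MHz reference clock when overclocked */

    printf("stock core clock:       %d MHz\n", multiplier * stock_htt); /* 2200 */
    printf("overclocked core clock: %d MHz\n", multiplier * oc_htt);    /* 2530 */
    return 0;
}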

 

You just need to take the time and read all the reviews you can to find out whether that cheap board will turn into a frisbee or a good buy.

 

The one thing I refuse to buy cheap is the power supply. Get a cheap one and you run the risk of blowing out your entire computer.

Man... I really want that $1000 rig now. The Phenom 9550s have been getting REALLY good overclocks.


My Dell mobo went bad, and that was a $2300 machine! What a rip. They even wanted me to pay for it after a year and a half. I wrote to Dell. I spammed every @dell.com email address I could find on the web with my case. Eventually they sent me a free replacement... after I had already bought a used, upgraded board. The new Dell mobo is for sale.

 

I've had lots of machines with ECS, Abit, ASUS, Gigabyte... mobos. The only failure was the Dell board, which was built by... CRS? Can't Remember Sh#t - forgot the name on it. I've also had GEIL RAM fail, and an Enermax power supply go soft.


Ahem, I shall chime in a little.

 

I have a complete Socket 939 computer for sale as we speak, and am willing to part it out as well.

 

CPU: AMD Athlon 64 3700+. This is the 1MB cache model. It is not dual core, and the memory controller got burnt out on it, so it can't run dual-channel RAM. =[

 

MB: DFI LANParty UT SLI-DR - says enough. A great, fast board with a custom northbridge cooler; most northbridges run pretty hot, so it keeps things a little more stable.

 

GPU: Radeon X800 clocked to 600/600; doesn't go over 50 degrees under load (video games). It has a custom cooler with Arctic Silver... it's a beast for being an old card.

 

PSU: OCZ PowerStream 520W, dual rail

 

RAM: 2 x 1024MB OCZ Platinum 2, PC4000, dual channel, 2-3-3-5 timings. Very fast RAM. It is NOT DDR2, but in all honesty I can benchmark this RAM faster than most DDR2 RAM, because DDR2 has slower latency timings than original DDR; take it or leave it.

 

HDD: Western Digital Raptor, 74GB version... enough said.

 

Custom CPU cooler, tons of fans... blah blah.

 

 

I wasn't looking for more than 300 bucks for the whole setup.


Sounds pretty close to my system actually.

 

3700+ San Diego clocked to around 2.4GHz IIRC (been a while since I set up my clock speeds)

 

1GB Corsair XMS Low Latency DDR333 (clocked to DDR410)

 

X800 GTO2, modded and clocked beyond X850 XT PE specs. Was able to clock up to 650 on the core and 620ish on the RAM. Was hoping for more out of the RAM but it was most stable at 615. I feel that with a better cooler the core could take 700 on a regular basis; I was using a stock Sapphire cooler.

 

I run a 4-drive RAID and achieve 95% efficiency in my SiSoft Sandra benchmarks, so if you took the speed of a single drive, multiplied it by 4, then multiplied by 0.95, you'd have my drive speeds. Of course seek time and such don't benefit from the RAID.
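
To put a rough number on that, here's a quick sketch of the math (the 70 MB/s per-drive figure is an assumed ballpark for a single Raptor, not a number measured in this thread):

/* Rough sketch of the RAID scaling claim above. The 70 MB/s per-drive
   figure is an assumed ballpark for a single Raptor, not a measurement
   from this build. */
#include <stdio.h>

int main(void)
{
    double single_drive_mb_s = 70.0;  /* assumed sustained rate of one drive */
    int    drives            = 4;
    double efficiency        = 0.95;  /* the 95% Sandra result quoted above  */

    printf("estimated array throughput: %.0f MB/s\n",
           single_drive_mb_s * drives * efficiency);   /* about 266 MB/s */
    return 0;
}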

 

Mobo is a Gigabyte; can't remember the model. The DFI board I wanted had a bad batch that people were having issues overclocking with - killing CPUs. So I went with a model that at the time had been seeing good results. From what I read, DFI was doing a good job replacing the bad motherboards; I just didn't want to deal with the hassle.

 

I spent about 1k, I think, on this "budget" gaming PC. It's hard to remember exactly, because the case, PSU, and backup hard drives I already had. I think I spent around $650 on the mobo, CPU, RAM, GPU, and the 4 hard drives. That was quite some time ago.


That's cool that you scored a Hybrid SLI motherboard - check this out:

http://www.nvidia.com/object/hybrid_sli_desktop.html

 

With Hybrid SLI, you can plug in a "low end" graphics card and team it up with the motherboard GPU (read: integrated graphics, but way the hell better than Intel's onboard crap) to make the combined GPU horsepower better - think of it as a turbo :) Also, if you ever have a high-end GPU (check the chart on the link above), you can manually (and soon it will be app-detected) switch over from the discrete (high-end) GPU to the motherboard (integrated) GPU, depending on what application you're using (game, video, web surfing)... saves a lot of power (and fan noise!).

 

Sounds like you've already bought the system, so I won't recommend any parts... just be careful with your hard drive copying. I built a system for my in-laws a few years back for Christmas with XP... I transferred their old HD over from their old Dell PC with their files... what I didn't think about was that their old PC had Windows ME... and the Sasser XP virus. BUT... because they didn't have XP, their Dell wasn't affected. The moment I powered up with the old HD contents, the Sasser virus saw XP and went to work... that was a long day :)

 

Let us know what you end up doing.

 

CPUs aren't as important as people think they are anymore - they've hit a wall in terms of performance per dollar spent. No need to spend a lot of cash on a quad core or high-speed dual core - your money is better spent on a good graphics card. They're great for gaming, as we all know, but they are (and will be) great for other consumer applications too:

 

PicLens (this is AWESOME):

http://piclens.com/

 

NVIDIA CUDA:

http://www.nvidia.com/object/cuda_home.html

Words of wisdom. Though CPUs are still getting faster and faster, you really NEED software capable of using multiple processors, and that software is still limited. For the most part, in most desktop apps you'll never see the difference between an old single core and the highest-end quad core anyway. I'm still waiting for the 8- and 16-core CPUs... :-D

 

Now, when you're zipping a huge 50GB file you'll really notice the speed difference. Or unpacking it for that matter.

 

For multitasking it sure is nice to be able to tell a program to stick to one processor though. But that's getting kinda geeky now isn't it?
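
If you'd rather do it from code than from Task Manager's "Set Affinity" menu, here's a minimal Windows sketch using the Win32 SetProcessAffinityMask call (an illustrative toy, not something from this thread):

/* Minimal Windows sketch: pin the current process to CPU 0, the same
   effect as Task Manager's "Set Affinity". Illustrative toy only. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Bit 0 set in the mask = allow only the first logical processor. */
    if (SetProcessAffinityMask(GetCurrentProcess(), 0x1))
        printf("This process is now stuck to CPU 0.\n");
    else
        printf("SetProcessAffinityMask failed: error %lu\n", GetLastError());
    return 0;
}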


Nice PC, bud! Mine's a San Diego as well...

 

Nice clock on the X800 too... I ran mine at 650/650 and it was getting unstable. I actually flashed the card's BIOS to an X850 XT PE's, but my card was freaking out because it's not really a reference Radeon... it's a Sapphire. =[

 

Oh well...

 

As for 64-bit Windows... I never liked it, mainly due to compatibility issues, and it's only a hair faster than 32-bit.

 

Hmm... unless something new has come out - correct me if I am wrong!

 



Hey Gollum :)

 

Why waste your money on only 8-16 cores when you can get a 240-core GPU right now?

 

As for the programming software that takes advantage of these cores, it's already out there and free - "C for graphics": CUDA.

 

http://www.nvidia.com/object/cuda_what_is.html

 

You guys are going to see some really cool stuff coming this year and next around GPUs and using their parallel processors for consumer applications.

 

How many of you guys transcode movies from DVD to iTunes, for example? Right now, Apple's $20 plug-in uses 100% of the CPU to do this... and it takes like 5 hours. A company called "Elemental" is coming out with a new transcoding plug-in (using CUDA) that uses the GPU (if detected... if not, it will just use the slow CPU), which takes about... mmmm... 30 minutes.

 

Read more about it here:

http://www.nvidia.com/object/io_1213610051114.html

 

There's also stuff coming from Adobe that's similar - performance gains you can't imagine.

 

CPUs are serial processors - GPUs are parallel - all 240 cores (240 on the high end graphics chips, fewer on the cheaper ones - but way more than 16...) can do jobs at the same time - not in serial like CPUs.
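
For the curious, here's roughly what that looks like in CUDA - a toy sketch made up for illustration (not from NVIDIA's samples): instead of a CPU loop walking an array one element at a time, the kernel hands one element to each of thousands of threads and they all run at once:

/* Toy CUDA sketch (made up for illustration): every GPU thread scales one
   element of the array, so the whole million-element job runs in parallel
   instead of a CPU-style loop that walks the array one element at a time. */
#include <stdio.h>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  /* this thread's element */
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1 << 20;                 /* about a million elements */
    float *d;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    /* 256 threads per block, enough blocks to cover all n elements at once. */
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d);
    printf("done\n");
    return 0;
}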

 

While you're at it... how many of you guys use Google Images or Flickr to find images? Tired of leafing through tons of pages to see what the next 20 or so images are? Then check this out:

www.piclens.com (1MB file). It will transform how you browse the web - I'm hooked on it. Oh, and it has built-in features for GeForce GPUs that CPUs alone and integrated graphics can't do :)

 

I'm biased of course, but I am really excited about the technology coming out of NVIDIA these days.


Opteron 165, 1.8GHz @ 2.6-2.8GHz with 1.400 Vcore (1.35 Vcore standard)

 

down-clocked it to 2.6 because my case fans are poo poo :(

 

I think with a better or quieter cooler I would fire up the case fans again and get it running at 2.8, but I feel no need. It's the OS being congested that causes the most problems, because I haven't taken care of it with proper cleanup after removing programs.


Anyone else read the recent article on Tom's Hardware about ray tracing? Looks like we might actually see ray tracing in video games in the near future!!!

 

Sadly, the software used today won't run on current Nvidia hardware. Not sure if it's actually a hardware issue or just a driver issue. I'm sure if the industry goes that direction, Nvidia will soon follow. I've really liked the Nvidia cards these last couple of years, but it seems ATI's wealth of features nobody cares about paid off for a brief period.

 

I used to be a huge ATI fan; I no longer care anymore, really. Same with AMD/Intel. After the AMD/ATI merger it seemed like all their stuff went down the pot, but they're making comebacks. I've lost all my bias though. I've built too many systems to care anymore, and there are always trade-offs with going with either brand.


I'm not personally having issues.

 

It's more about the fact that ATI has been working at making sure they're compatible with features that most of the industry isn't ready for, or just plain not using. If someone made a game that actually used all these features, then their best cards would be lucky to maintain decent framerates.

 

You can check this article here for a brief overview of the new ray tracing on ATI that people are talking about:

 

http://www.tgdaily.com/content/view/38145/135/

 

The software in question is made by a company called JulesWorld. This is their site:

 

http://www.otoy.com/site/start.htm

 

WAIT: I reread the article. He said it can be made to work with the new 8800 cards; it just needs more work on the software side.

 

So I guess Nvidia won't need to do too much to stay compatible if a developer decides to go this route.


Gollum - sounds like Otoy is very similar to PicLens (www.piclens.com). We work with developers to ensure our GPUs work well with their content, but we can't optimize for every application, of course - we only have so many engineers and there are so many software companies :)

 

Regarding Ray Tracing, there's an interesting recent article from CNET on that: http://news.cnet.com/8301-13512_3-9967175-23.html

 

 

June 12, 2008 2:00 PM PDT

Ray tracing for PCs-- a bad idea whose time has come

 

Posted by Peter Glaskowsky

Dean Takahashi sent me an e-mail pointing to a piece he wrote on VentureBeat describing statements Wednesday by Intel's Chief Technical Officer Justin Rattner targeted at NVIDIA. CNET's own Brooke Crothers covered the same story and provides additional background here.

 

Intel Chief Technology Officer Justin R. Rattner

(Credit: Intel)

 

The technology at issue relates to 3D graphics for PCs. All current PC graphics chips use what's called polygon-order rendering. All of the polygons that make up the objects to be displayed are processed one at a time. The graphics chip figures out where each polygon should appear on the screen and how much of it will be visible or obstructed by other polygons.

Ray tracing achieves similar results by working through each pixel on the screen, firing off a "ray" (like a backward ray of light) that bounces off the polygons until it reaches a light source in the scene. Ray tracing produces natural lighting effects but takes a lot more work.

(That's the short version, anyway. For more details, you could dig up a copy of my 1997 book Beyond Conventional 3D. Alas, the book is long since out of print.)

Ray tracing is easily implemented in software on a general-purpose CPU, and indeed, most of the computer graphics you see in movies and TV commercials are generated this way, using rooms full of PCs or blade-server systems.

Naturally, Intel loves ray tracing, and there are people at Intel working to make ray tracing work better on Intel hardware.

The occasion for Rattner's remarks Thursday was a meeting for industry analysts at the Computer History Museum. At the meeting, according to Takahashi, Intel showed how a four-chip, 16-core demo system could play "Quake Wars: Enemy Territory" at 16 frames per second.

Honestly, that's pretty pathetic, since you can get higher frame rates with a dual-core CPU plus one good graphics chip. Your system price and power consumption will be a tenth that of the Intel demo system.

Rattner basically implied that Nvidia must actually agree with Intel that ray tracing is a good idea because Nvidia recently bought ray-tracing firm RayScale and Rattner says Nvidia is trying to hire away Intel's ray-tracing people.

Takahashi compared this conflict with the "Phoney War" of 1939-1940 and said the real fighting will begin when Intel introduces Larrabee, a CPU-based graphics chip, at Siggraph in August.

But I don't think there's going to be much of a fight there.

Intel is trying to defend a crazy idea-- that CPU-based ray tracing is a practical alternative to GPU-based polygon-order rendering.

We can guess why they decided to push this alternative--Intel's a CPU company and its people are CPU-centric. But the numbers don't work out: ray tracing takes more work than polygon-order rendering. Going from pixels to polygons requires searching (tracing rays), whereas going from polygons to pixels merely requires a relatively simple set of calculations known as "triangle setup."

Ray tracing's advantages for lighting effects are pretty minor; current graphics chips can be programmed to get good results there too, with less work.

I imagine Intel noticed that ray tracing could be another way to use the many cores in Larrabee, and figured this could be the basis of some competitive differentiation, but what should have been a minor point in some future marketing campaign has grown into an overblown strategic initiative.

On the hardware side, Larrabee isn't even optimized for ray tracing. On the software side, there's no support for ray tracing in Microsoft's Direct3D middleware, and no way any version of Direct3D in the foreseeable future will rely on ray tracing.

Larrabee will certainly support ray tracing--every CPU does--and some future version of Direct3D may support ray tracing as an option, but it could be 10 years or more before ray tracing becomes a required feature for any real-world software.

And to whatever extent ray tracing can be useful, Nvidia can write efficient ray-tracing code for its GPUs faster than Intel can tape out more capable versions of Larrabee. Nvidia is looking for ways to use ray tracing for lighting and other purposes, but this effort is minor compared to the work it's putting into polygon-order rendering.

Rattner is very smart--too smart not to know the situation. I think he's just doing his job, supporting his company's position whether he fully agrees with it or not.


And once Intel starts selling Larrabee, it's only going to get a day or two to talk about ray tracing before the focus will turn, properly, to Larrabee's performance on the technology that matters: good old polygon-order rendering. And at that point, I don't think Intel's going to have much to say.
