
Computer tech blog

Started by Dr. Mario, March 11, 2009, 05:28:30 pm



Dr. Mario

Oh well, it's still good for a few years. Besides that, the GeForce GTX 290 is pretty expensive, and so is its low-end counterpart, the GTX 260.

Anyway, using an AMD CPU will still give it some of the edge needed for games and stream computing, although the ATI Radeon HD 4000 series is much cheaper and more powerful than the GeForce GTX 200 series.

(BTW, I'm not an NVIDIA hater; I still use GeForce and like it. I'm just disappointed that it will never use XDR DRAM, and thus will never be able to outperform Larrabee and the Radeon HD 5000 series.)

Added after 23 minutes:

Sorry for the double-post... ^-^; Had to run.

Anyways, the reason I want to SEE XDR DRAM chips on both the video card and linked directly to the Phenom II's XIO controller is that DDR technology is getting overripe and fruitless: XDR can still outperform GDDR5, latency-wise. Besides, both Intel and AMD already have their eyes on XDR DRAM.
Now, Bowser... What can I do with you...

Smokey

Oh, I only switched to ATI because AMD bought them and most AMD boards now have CrossFire... (that, and I realized how many more FPS you get for your buck ^_^). I never hated NVIDIA and I won't in the future... ATI just has the edge in my situation... (AMD lover ^_^)
I don't tell you how to tell me what to do, so don't tell me how to do what you tell me to do... (Bender the Great) :/
[Img disabled by Fedora-Tan]
Thanks Fedora-sama
Homer no function beer well without (Homer Simpson) ^_^

NejinOniwa

Seriously...what IS the edge that AMD has over Intel, then?
I didn't quite catch that.
YOU COULD HAVE PREVENTED THIS

Smokey

AMD processors are more efficient per clock cycle than Intel ones...

Simple example: years ago, an Intel core would need 6 cycles to process 6*6 and give 36 as the answer, while an AMD core would do that in less than half the cycles...
That, plus AMD processors are easier to overclock, and they're cheaper...
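Smokey's point is essentially about instructions per clock (IPC): effective throughput is clock rate divided by cycles per operation, so a chip that needs fewer cycles per op can win at a lower clock. A minimal sketch of that arithmetic (the clocks and cycle counts below are made-up illustration, not measured figures for either vendor):

```python
# Effective throughput = clock rate / cycles needed per operation.
# The numbers are illustrative only, not real vendor benchmarks.

def ops_per_second(clock_hz: float, cycles_per_op: int) -> float:
    """Operations per second a core can retire at a given clock."""
    return clock_hz / cycles_per_op

# Hypothetical cores: B runs at a lower clock but needs fewer cycles per op.
core_a = ops_per_second(3.0e9, 6)   # 3.0 GHz, 6 cycles per multiply
core_b = ops_per_second(2.6e9, 3)   # 2.6 GHz, 3 cycles per multiply

print(core_a)           # 5.0e8 ops/s
print(core_b > core_a)  # True: slower clock, higher throughput
```

The takeaway is that raw clock speed alone doesn't decide the race, which is why per-cycle efficiency claims like the one above matter at all.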

Dr. Mario

True. And its RISC core is much like PowerPC on steroids. Meaning the Phenom II and Athlon 64 could be modified and blended together to make a true x86 Cell BE.

(This CBE would be different: its Athlon 64-based SPE would be an out-of-order coprocessor, the exact opposite of the PowerPC SPE, mostly to deal with multiple issue.) Meanwhile, it would take too much work to rework an Intel CPU into a CBE processor.

Knowing that the AMD Hammer RISC core is an advanced multi-issue engine, it could happen soon.

Smokey

Yep, and I haven't (bothered to) read about Intel making a multi-role processor...
Which is what I've wondered about for some time... if GPUs are so good at floating-point operations, then why can't they assist the CPU with math work when needed?...

NejinOniwa

I thought that function was being developed for the next-gen ATI chips?

Dr. Mario

Smokey, it can be done as of now. You would have to write a piece of software to have the CPU borrow the extra few gigaFLOPS from a GPU's DSPs (like NVIDIA's CUDA applications, for example). Go with an ATI Radeon HD, since it's a bit easier to program. (AMD has the Radeon HD 2000 programming manual posted on its developer website, although I'm sure it will work with more advanced parts, like the current Radeon HD 4000 series.)

Most Blu-ray Disc player software already does that, so it can decode H.264 without drowning the CPU.
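The general shape of that kind of offload is a dispatch layer: probe for a usable GPU backend, hand the heavy floating-point work to it, and fall back to the CPU otherwise. A minimal sketch of the pattern — `gpu_available` and `gpu_dot` here are hypothetical stand-ins, not a real API; an actual version would call into something like CUDA or AMD's Stream SDK:

```python
# Dispatch pattern: offload heavy FLOP work to a GPU backend when one
# is available, otherwise fall back to a plain CPU implementation.
# gpu_available() / gpu_dot() are hypothetical stubs, not a real binding.

def gpu_available() -> bool:
    return False  # stub: a real probe would query the GPU driver

def gpu_dot(xs, ys):
    raise RuntimeError("no GPU backend in this sketch")

def cpu_dot(xs, ys):
    # Plain CPU fallback: elementwise multiply and sum.
    return sum(x * y for x, y in zip(xs, ys))

def dot(xs, ys):
    """Pick the best available backend for a dot product."""
    if gpu_available():
        return gpu_dot(xs, ys)
    return cpu_dot(xs, ys)

print(dot([1, 2, 3], [4, 5, 6]))  # 32
```

The player software mentioned above does the same thing at a larger scale: detect a capable GPU at startup and route the H.264 decode stages through it, keeping a CPU path as the fallback.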

NejinOniwa

OH, BTW
THE LAW OF H.264 MKV

You can fit the current-generation video resolution/quality on the last-generation media.

Considering a 1080p movie generally runs around 10-15 gigs, I wonder what the limit of an H.264 MKV Blu-ray is... _W_

Dr. Mario

A BD's main file, the M2TS stream, is usually capped at 10-20 GB (for a single-layer BD-ROM, depending on movie duration), the audio (MP3/AC3) can be as large as 5 GB, and what's left of the disc's free space is filled with software (sometimes a Linux OS) and Easter eggs. A single-layer BD's capacity is 25 GB, that is.
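The back-of-the-envelope limit is just capacity divided by average bitrate. A quick sketch for a single-layer disc (25 GB is the single-layer capacity; the 20 Mbit/s combined bitrate is an assumed figure for illustration, not a number from the BD spec):

```python
# How long a movie fits on a single-layer Blu-ray at a given bitrate.
# 25 GB is the single-layer capacity; the average bitrate is an
# assumed figure for illustration.

CAPACITY_BYTES = 25 * 10**9      # 25 GB, single-layer BD
avg_bitrate_bps = 20 * 10**6     # assumed 20 Mbit/s video + audio

seconds = CAPACITY_BYTES * 8 / avg_bitrate_bps
print(round(seconds / 60))  # 166 or 167 minutes, i.e. ~2h45m
```

Halve the assumed bitrate and the runtime doubles, which is why a leaner H.264 encode can squeeze so much more onto the same disc.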

And HD-DVD and BD MKV playback can never be the same: HD-DVD players use a low-end CPU (paired with 128 MB of DDR2), while BD players use a more advanced CPU, most of the time paired with 256 MB of XDR DRAM. (Why the larger XDR memory? It's cheaper than DDR.)

Smokey

Oooh, that'll make for epic performance, especially if you couple a couple of PCs together in a Beowulf cluster and put 2 or more video cards in each, dedicating most of the GPU performance to assisting the CPUs... ^_^
(Heh, have 4 Radeon HDs in a system and use an old PCI VGA card for the actual graphics... :P)

Dr. Mario

Although it can be done, I don't think it's safe. In rare cases, an old PCI video card will cause deadlocks in the PCI Express x16 host controller (located in the northbridge), since the BIOS only supports it as the primary VGA in emergency mode (that is, with only the boot block running because the rest of the BIOS image failed to load, due to a corrupt image).
That said, I doubt that using the old card as the primary VGA would consume much horsepower. (Normal usage in XP only claims about 50 megaFLOPS for the GUI, which is small.)

Smokey

Oh, most Beowulfs run on Linux... and Linux distros are supposed to be very frugal with a system's resources... ^_^

Dr. Mario

BIOSes are especially strict with resources; the BIOS usually doesn't give a sh*t what OS you're using. It's mostly there to hide some (internal) hardware and to keep the computer from running amok. I hate the BIOS for many reasons: it's ancient, no matter how new the mobo is; it's so frigging buggy that it DOES crash; and it has an ugly MS-DOS 5.0-style GUI.

Apparently, Award does encourage open BIOS projects like Coreboot (formerly known as LinuxBIOS).

Also, I have written a GreenOmega 64-based 64-bit BIOS (with a really nice GUI).

Added after 16 minutes:

(Yeah, my very own firmware is 64-bit and AMD64-friendly.
But what about a 16-bit boot sector? The firmware simply drops out of long mode and rewrites some of the handles into 16-bit ones, then deals with a legacy OS like Windows 2000.)

Also, Smokey, Coreboot is the way to go if you want the Beowulf but are cussing at the regular BIOS. If you're curious, go to coreboot.org.
* Coreboot firmware has also been reported to successfully boot Windows Vista.

Smokey

Oh, oops... I always saw the BIOS as the most stable and rugged piece of software in the entire system... ^_^
But the old DOS GUI doesn't bother me; I kinda like DOS. It's just a shame that BIOSes don't live up to my expectations, now that I've read that...

BTW, I'll go check Coreboot out, sounds interesting indeed... An open-source BIOS sounds a bit unnatural to me... :D