You can spill the beans on what you like or dislike, or what your projects are - I mean, anything about CPUs, RAM, hard drives - anything about computers!
I figured it would be a bit better to squawk about tech here, since some of us were getting off topic. (Mostly me, I admit.. ^-^; )
Do you actually need two PCI-e slots to have SLI or CrossFire? 'Cause as far as I can tell, everything I've seen only has one.
SLI or CrossFire consists of a mainboard with two PCI-e x16 slots and an identical video card in each of those two slots... Nowadays there's more tolerance, but basically it needs the same GPUs and preferably the same amount of RAM, otherwise the extra RAM gets disabled... (For example, if you use one GF8500GT 256MB and one GF8500GT 512MB, you would have a total of 512MB of video memory, since the second 256MB on the second card is disabled.)
Also, SLI and CrossFire have nowadays expanded to three or four video cards... I don't quite know what nVidia has named it, but ATI speaks of CrossFire X, which means you can now have up to 4 GPUs (or even 8, but I don't know how good the support for dual-core video cards is in SLI/CrossFire; maybe Dr mario knows) in your system. And to top it off, there is also a hybrid version, which allows you to use your onboard GPU along with one or more video cards (those video cards may differ from the onboard chip, but I suggest you still buy an identical pair of video cards if you install more than one)...
So to use SLI/CrossFire, you will need a motherboard which supports it by having at least two full-speed PCI-e x16 slots, or onboard VGA plus a PCI-e x16 slot... So when buying, watch for that SLI/CrossFire support, and also see to it that the board has enough full-speed PCI-e x16 slots for your needs...
And as a final note, beware that ATI cards will not work on an SLI board and nVidia cards will not work on a CrossFire board... So pick wisely... (I recommend the AMD/ATI combination)
Phew, long time since I had a rant... ^_^
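That VRAM-mirroring rule is easy to sanity-check with a quick sketch (the `usable_vram` helper is purely hypothetical, just modeling the behavior described above):

```python
def usable_vram(cards_mb):
    """SLI/CrossFire mirror data across GPUs, so each card can only
    use as much VRAM as the smallest card in the set provides."""
    if not cards_mb:
        return 0
    per_card = min(cards_mb)          # extra memory on bigger cards is disabled
    return per_card * len(cards_mb)   # total installed-and-active memory

# The 8500GT example from the post: 256 MB + 512 MB -> 512 MB active total,
# because the second card's extra 256 MB sits idle.
print(usable_vram([256, 512]))
```

With a matched pair (two 512 MB cards) the same helper gives the full 1024 MB, which is why an identical pair is the recommendation above.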
That is in fact not entirely true.
The Intel X58 chipset supports BOTH SLI and Crossfire.
NejinOniwa, you're correct. SLI and CrossFire are basically the same; the only differences are the commands fired over PCI-e, the firmware on the video card and motherboard, AND the OS / driver used. Still, be careful - they're tricky to use.
And Octo SLI is only feasible if you have a large case with at least 4 PCI-e slots (best bet is to grab a mobo with all 7 slots as PCI-e x16), AND a huge 1.0 kW power supply (they'll suck 70 amps peak when rendering advanced 3D math).
Added after 14 minutes:
Oh, and watch what you're doing with your cooling system. Octo SLI gets hot easily, because there are four fans sucking a handful of air at a fast clip, so use a few larger, faster-spinning fans to ease the vacuum problem a bit.
Now, Red_Machine, to your question: yes, you need two PCI-e x16 slots. If you don't have that kind of motherboard, and if you're in the USA, you can try www.newegg.com and www.tigerdirect.com - they're cheap over there. (I have personally used Newegg; pretty cheap and very good.)
Red_Machine is over in the UK...
And I basically typed from my personal experience, which was with earlier boards, and that SLI chipset absolutely refused to boot with an ATI video card...
Also, if you have the cash to run Octo SLI/CrossFire, I assume you'll also have the money for liquid cooling, which in that case is the preferred option...
As I was saying in Topicless, I'm getting a laptop.
Family's gonna back me up, and I've got resources, so I thought I might as well go all out on this one.
Question - What should I get?
I have absolutely no clue about laptops. I mean, I know what I want - some serious hardware that can handle some gaming and HD MKV and so on, and I also want some awesome battery life. Also a good keyboard, since I'll mostly be using it for writing (I heard somewhere that Acer has big keyboards).
Then - brand, model, everything else - I've no clue.
That's the only one I could find so fast...
It has a Turion X2 @ 2.4GHz (of course it's AMD; I refuse to look at Intel stuff ^_^)
And an ATI Mobility Radeon HD 3850 with 512MB of dedicated RAM (dedicated is important, since shared is slow and, for gaming, fail)
Plus all the rest: HDMI out, WLAN, etc...
I know it's no Acer, but I couldn't find one with dedicated vidmem so quickly...
Added after 6 minutes:
Or this Acer...
It has an Intel Core 2 Duo
But it also has an nVidia Mobile GeForce 9700 with 512MB of RAM (also dedicated)
True, true, Smokey. IBM ThinkPads are also good, but too expensive. Sometimes building your own laptop (custom barebone) may help, but that's also expensive if you're not careful about looking for a dirt-cheap deal.
If you're looking for horsepower, stick with AMD for a while before going for Core i7; Mobile Core i5 / i7 is still in development (to be paired with Centrino 3).
BTW, NejinOniwa - it's alright, you're not really off-topic. ^-^;
Added after 15 minutes:
And Mobile GeForce 9 and AMD Radeon HD 3k / 4k are both equally good for MKV post-processing (MKV is similar to the Blu-ray Disc file format, M2TS). To ensure smoother BD video playback (stutter-free gaming included), try getting larger dedicated RAM too.
Also, avoid D€ll, they don't last long nowadays.
Well, hell, most things on the market today have somewhere between 2 and 4 gigs of RAM, so memory isn't normally an issue... I'm having a hard time finding good graphics cards though (stupid sites without good filters!), so yeah, I'll be looking around for a while... And Smokey, that MSI model doesn't even exist here ^^
Oh, oopsie... At least that Acer does, and it can also steal your normal RAM if it needs more video RAM...
But yes, if you want a good processor, get an AMD...
Yeah, I didn't expect the chosen GPU to be so hard to find. And of course, some AMD 7-series chipsets do have the Radeon HD 3200 (good for BD, bad for Crysis) - I have a Gigabyte MA78GM-S25HP (AMD 780G) - I know it's not a laptop, but they do use the same chip in laptops. I may be planning on getting either a Radeon HD 5k or Larrabee for my computer, though.
Larrabee? Isn't that the in-the-works Intel one?
INTEL!!! *crosses himself*
Well, I'm not a big fan of Intel, but it left me with no choice, because Anime3D (a GreenOmega executable - more or less GIMP, Blender, and FASM tossed in a mixer) now needs a GPU that it can program itself with special 3D aspect data. AMD did their own Larrabee; I don't know if that project survives anymore. (Supposedly the Radeon HD5k might use a modified Athlon 64 as a streaming coprocessor...)
Added after 9 minutes:
As far as I know, both the AMD Radeon HD5k and Intel Larrabee use XDR2 memory, mainly to handle heavy bandwidth. They're like an angry mule, only silicon-based nowadays! ^-^
Also, Larrabee is only left as an optional part. It will be decided later, once I get my thermal-paste-covered hands on a Radeon HD5k.
Gawd, and I'm still using a pair of GF8500s... This stuff is seriously outdated...
Oh well, it's still good for a few years. Besides, the GeForce GTX 290 is pretty expensive, and so is its lower-end counterpart, the GTX 260.
Anyway, using an AMD CPU will still give it some of the edge needed for games and stream computing, although the ATI Radeon HD4k is much cheaper and much more powerful than the GeForce GTX 2x.
(BTW, I'm not an NVIDIA hater; I still use GeForce and like it. I'm just disappointed that it will never use XDR DRAM, and thus will never be able to outperform Larrabee and the Radeon HD5k.)
Added after 23 minutes:
Sorry for the double post... ^-^; Had to run.
Anyway, the reason I want to SEE XDR DRAM memory chips on both the video card and linked directly to the Phenom II's XIO controller is that DDR technology is getting overripe and fruitless: XDR can still outperform GDDR5, latency-wise. Anyway, both Intel and AMD already have their eyes on XDR DRAM.
Oh, I only switched to ATI because AMD bought them and most AMD boards now have CrossFire... (That, and I realized how much more FPS you get for your buck ^_^.) I never hated nVidia and I will not in the future... ATI just has the edge in my situation... (AMD lover ^_^)
Seriously...what IS the edge that AMD has over Intel, then?
I didn't quite catch that.
AMD processors are more efficient per clock cycle than Intel ones...
Simple example: years ago, an Intel core would need 6 cycles to process 6*6 and give 36 as the answer, while an AMD core would do it in less than half the cycles...
Plus, AMD processors are easier to overclock, and they're cheaper...
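The cycles-per-operation point above is easiest to see with a toy throughput formula (the clock and cycle counts here are made up for illustration, not real CPU data):

```python
def effective_mops(clock_mhz, cycles_per_op):
    """Millions of operations per second for a fixed per-op cycle cost."""
    return clock_mhz / cycles_per_op

# Illustration of the 'fewer cycles per multiply' claim: same 2000 MHz clock,
# one core needing 6 cycles per multiply vs one needing only 3.
slow = effective_mops(2000, 6)   # ~333 Mops/s
fast = effective_mops(2000, 3)   # ~667 Mops/s
print(fast / slow)               # halving the cycles doubles the throughput
```

This is just IPC in miniature: at equal clocks, the core that spends fewer cycles per operation wins, which is the comparison the post is making.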
True. And its RISC core is much like a PowerPC on steroids, meaning the Phenom II and Athlon 64 could be modified and blended together to make a true x86 Cell BE.
(This CBE would be different - its Athlon 64-based SPEs would be out-of-order coprocessors, the exact opposite of the PowerPC SPEs, mostly to deal with multi-issue.) Meanwhile, it would take too much work to rework an Intel CPU into a CBE processor.
Knowing that AMD's Hammer RISC is an advanced multi-issue engine, it could happen soon.
Yep, and I haven't (bothered to) read about Intel making a multi-role processor...
Which is what I've wondered about for some time... If GPUs are so good at floating-point operations, then why can't they assist the CPU with math work when needed?...
I thought that function was being developed for the next-gen ATI chips?
Smokey, it can be done as of now. You would have to write a piece of software to have the CPU borrow the extra few gigaFLOPS from any GPU's DSPs (like NVIDIA's CUDA applications, for example). Go with an ATI Radeon HD, since it's a bit easier to program. (AMD has the Radeon HD2k programming manual posted on their software developer website, although I'm sure it will work with more advanced ones, like the current Radeon HD4k.)
Most Blu-ray Disc player software does that, so it can process H.264 without drowning the CPU.
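The offload idea boils down to a backend dispatch: route the FLOP-heavy work to the GPU when one is present, fall back to the CPU otherwise. A minimal sketch (every function name here is a hypothetical stand-in; a real implementation would call CUDA/Stream APIs, not Python functions):

```python
# Hypothetical decoders standing in for CPU-side and GPU-accelerated paths.
def decode_on_cpu(frames):
    return [f"cpu:{f}" for f in frames]

def decode_on_gpu(frames):
    return [f"gpu:{f}" for f in frames]

def decode(frames, gpu_available):
    """Pick the GPU backend when a capable card is detected, like BD
    player software does for H.264, otherwise keep the work on the CPU."""
    backend = decode_on_gpu if gpu_available else decode_on_cpu
    return backend(frames)

print(decode(["I", "P", "B"], gpu_available=True))
```

The important part is that the caller never changes; only the backend selection does, which is how player software can transparently spare the CPU.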
THE LAW OF H.264 MKV
You can fit the current-generation video resolution/quality on the last-generation media.
Considering a 1080p movie generally runs around 10-15 gigs, I wonder what the limit of an H.264 MKV Blu-ray is... _W_
A BD's file, M2TS, is usually capped at 10-20GB (for a single-layer BD-ROM, depending on movie duration), and the MP3 / AC3 audio file can be as large as 5GB; what's left of the BD's free space is filled with software (sometimes a Linux OS) and Easter eggs. BD's capacity is 25GB, that is.
And HD-DVD and BD MKVs can never be the same: HD-DVD players use a low-end CPU (paired with 128MB of DDR2), while BD players use an advanced CPU, most of the time paired with 256MB of XDR DRAM. (Why the larger XDR memory? It's cheaper than DDR.)
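Those capacity figures translate into a bitrate budget you can check with quick arithmetic (a sketch assuming decimal gigabytes, the way disc vendors count them, and ignoring audio/extras overhead):

```python
def max_avg_bitrate_mbps(capacity_gb, duration_min):
    """Highest average bitrate (Mbit/s) that still fits on the disc."""
    bits = capacity_gb * 8 * 1000**3   # decimal GB -> bits
    seconds = duration_min * 60
    return bits / seconds / 1e6

# A 2-hour movie filling an entire single-layer 25 GB BD:
print(round(max_avg_bitrate_mbps(25, 120), 1))  # ~27.8 Mbit/s average, tops
```

So a 10-15 GB 1080p rip is sitting well under half of what a single-layer disc could sustain for the same runtime.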
Oooh, that'll make for epic performance, especially if you couple a couple of PCs together in a Beowulf cluster and put 2 or more vidcards in each, dedicating most of the GPU performance to assisting the CPUs... ^_^
(Heh, have 4 Radeon HDs in a system and use an old PCI VGA card for the actual graphics... :P)
Although it can be done, I don't think it's safe. In rare cases, an old PCI video card will cause deadlocks in the PCI Express x16 host controller (located in the Northbridge chipset), since the BIOS only supports it as a primary VGA in emergency mode (that is, with only the boot sector running while the rest of the BIOS' OS fails to boot, due to a corrupt image).
I doubt that using an interconnected video card as the primary VGA would consume much horsepower. (Normal usage in XP will only claim 50 megaFLOPS for the GUI, which is small.)
Oh, most Beowulfs are run on Linux... And Linuxes are supposed to be very frugal with a system's resources... ^_^
BIOSes are especially strict with resources; they usually don't give a sh*t what OS you're using. It's mostly to hide some (internal) hardware and also to keep the computer from running amok. I hate the BIOS for many reasons: it's ancient, no matter how new the mobo is; it's so frigging buggy that it DOES crash; and it has that ugly MS-DOS 5.0-style GUI.
Apparently, Award Inc. does encourage open BIOS projects like Coreboot (a Linux-based BIOS).
Also, I have written a GreenOmega 64-based 64-bit BIOS (with a real nice GUI).
Added after 16 minutes:
(Yeah, my very own firmware is 64-bit and AMD64-friendly.
But what about the 16-bit boot sector? The firmware simply quits Long Mode and rewrites some of the handles into 16-bit ones, then deals with legacy OSes like Windows 2000.)
Also, Smokey, Coreboot is the way to go if you wanna have the Beowulf but are cussing at the regular BIOS. If you're curious, go to coreboot.org
* Coreboot firmware was also reported to have successfully booted Windows Vista.
Oh, oops... I always saw the BIOS as the most stable and rugged piece of software in the entire system... ^_^
But the old DOS GUI doesn't bother me; I kinda like DOS. It's just a shame that BIOSes don't live up to my expectations, now that I've read that...
BTW, I'll go check Coreboot out, sounds interesting indeed... An open-source BIOS sounds a bit unnatural to me... :D
Yeah, baby - Coreboot's based on uClinux (the microcontroller version of Linux)!!
BTW, what makes it unique among the other uClinux OSes is that it has access to SMM, dedicates most of its MMU jobs to the on-die memory controller (on AMD's 64-bit CPUs, and the Intel Core i7), and has the widest selection of drivers for soldered-on parts. AND, yes, it has an EFI utility (it works perfectly on 6x- and 7x86-based motherboards with Coreboot written to the BIOS flash EEPROM chip).
Added after 15 minutes:
Also, be careful with Phenom II CPUs when dealing with Coreboot, because they potentially have an XIO memory controller. (I have seen it keep posting memory controller error messages on a Phenom II, so I tried the Cell BE initialization manual, copied the XDR memory initialization and combined it with the DDR initialization so it could try either - it booted up fine, regardless of the L2 cache being used as main memory at boot time.)
Also, I liked that idea of writing your very own firmware (BIOS), so I wrote a few, based on the GO64 OS.
Well, I know that if you were to make a mobo from scratch you would have to write the firmware, and I guess not just for the BIOS...
Well, yeah. Actually, the BIOS is the only firmware on many motherboards, other than the processor's on-die boot firmware (the same ROM/EEPROM area where the CPUID and instructions are located; on AMD's CPUs it's at 0x0000 [same as on ARM CPUs], primarily to boot a small but important firmware, 16KB in size).
Also, Coreboot is great for any Linuxphiles who build their own motherboard.
(All hail to Kami-Tux!!)
Also, having experience writing a BIOS: you will want to find every single tech spec for every chip on the mobo, for the drivers.
Added after 16 minutes:
That is, the drivers in the BIOS image, original AND homemade, are there to maintain the hardware during boot, so the OS boots successfully with all the hardware you want in its hardware stacks - mostly passed through registers like EAX (provided you know ASM and C++) - so you can use a motherboard to its fullest.
Oh, and if you want the best GUI, please reconsider the memory size of the BIOS flash EEPROM after you've taken care of the OS kernel and drivers. Most of the time, I'm satisfied with OpenGL rendered on a VESA 3.0 VGA.
For great efficiency, one could also merge BIOS and OS on the same chip, right?
Although having the OS do absolutely everything, from booting the entire system to running it when it's all up, would require it to live in fast memory, and it would need to be programmed extremely well...
Once you got something like that running, though, I think you could really speed booting up...
Have done it. The copy of GreenOmega on the hard drive can use the BIOS' kernel of the same nature. It eliminates the Real-Mode boot (dry boot); instead, it can execute any software without the help of an EFI utility. It can also load programs off hard drives, CDs, DVDs, BDs, and most other common storage, from just a plain command prompt or a mouse click. The iso-kernel also helps with CPU emulation very well, like running Mac OS 9.
Exactly the same can be done with Coreboot, too.
Oh, sweet... (And typical me, btw: every time I have a bright idea, it already exists... ^_^)
Yup. Also, I'm trying to take my own firmware's usefulness into account. What about language? It has English and Japanese built in, as a special requirement.
You said fast memory? I tried it on a CPU (a Phenom II with its memory controller unlocked) - the kernel fits fine in the Level 3 cache, but when it comes to avoiding a dry bootup of GO64 off a hard drive, it gets written out to either DDR3 or XDR (in this case, 3.2GHz XDR - just to see if it would complain or crash; I took the bug in the 65nm Agena seriously).
Wow, on-die booting?...
Well, I guess you could fit a DOS-like OS on there...
YES, it can be done; the Phenom II's L3 is 6MB. (And on the Core i7, around 8MB, just in case anybody is interested in what's on the i7's die.) Also, a Japanese x86 RISC (not released yet - I've set my thermal-paste-covered hands on it already), a Cell BE version, has a monstrous L3 cache. Back to DOS: in Real Mode it only takes up 1MB (except for modified DOS that runs in Long Mode), so it would fit in 1/6 or 1/8 of the operating L3 cache's available area. The BIOS I designed consumes 2MB, even in Long Mode.
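The cache-fit arithmetic is trivial to check (a sketch assuming the 1MB real-mode footprint and the 6MB / 8MB L3 sizes quoted above):

```python
def cache_fraction(cache_mb, footprint_mb):
    """Fraction of the L3 cache a resident OS image would occupy."""
    return footprint_mb / cache_mb

# Real-mode DOS tops out at 1 MB of address space:
print(cache_fraction(6, 1))   # ~1/6 of a Phenom II's L3
print(cache_fraction(8, 1))   # ~1/8 of a Core i7's L3
```

Which matches the "1/6 or 1/8" figure in the post, with room left over even for the 2MB Long Mode BIOS mentioned.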
My god, that'll make for an insanely fast OS... Should boot in nanoseconds... :D
And who needs a flashy GUI, as long as the games are supported... (By having the OS load a DX(-like) subsystem from a SATA disk when needed... ^_^)
Well, my BIOS has a flashy GUI (blue glass style); I pulled it off with only 256KB of executable code (it also has a Win3.1-style GUI).
Also, I did PS1 emulation, with only the original Sony firmware image and a spaghetti of commands, strictly text-based - it ran a PS1 game fine.
Then why is Vista such a HDD-hog?!
True. But Vista's kernel is not to be labeled an HDD hog, though. It's more the opposite, a RAM hog.
My GO kernel is written within 256KB, up to 1MB, and loads only what is needed (for example, a CPU driver, like HyperTransport, and/or the RAM controller, to assist in pre/post-boot).
Well, I find that if your OS occupies 41.57GiB, it can be labeled an HDD hog as well...
Well, it's not easy to keep it from being a hog. Initially, it's 400MB. My HDD containing GO64 is approaching 67GB, as a result of software development and anime CGI, and many other programs - like OpenOffice.org, just to name a few.
Oh, I haven't installed anything on my OS partition... No, I do that on other disks... The only things on my OS disk are the OS, its updates, and maybe a GiB max of stuff that absolutely had to be installed on that disk...
Oh well. I like to keep important software on the same HDD where GreenOmega 64 is located, so it's easy to keep track of.
Well, I couldn't even if I wanted to... ^_^
That partition is only 46.5GiB... Which I thought would be plenty for Vista...
My ME partition is 30GB, and I have the rest (the drive is 40GB) as unpartitioned space in case I need it. I was originally gonna have an AROS partition, but even though this machine is single-core, it still refuses to boot!
It sucks sometimes. Oh well, the pains do inevitably go away as HDD prices go down. =_=
Oh, right. Most of the time, the boot sectors are written differently for particular OSes - like you said, Windows ME and AROS. Sometimes it's better to have a separate HDD; otherwise, experimentation never hurts.
True, I couldn't get a multiboot working with Fedora and Vista... The native bootloaders refused to boot the other OS... Heck, the GRUB bootloader even refused to be installed on an NTFS disk... :/
Yeah. If you wanna do a LILO/GRUB combo on NTFS, it has to be done from the Linux installer.
Or make a small partition, format it as FAT, and toss GRUB in there... ^_^
Yes, it can be done, mostly because FAT is the most common FS, whether it's written by a commercial, GNU'd, and/or homemade OS kernel.
As far as I know, all GRUB needs is a 32MB FAT (16, if I'm right) partition to do its magic...
About that. And if you wanna do the EFI version of GRUB, you will wanna leave a 200MB FAT partition; that way, firmwares other than my own BIOS can tell that it's for EFI boot. (My GO64 AIOS firmware doesn't care; it simply hunts for the EFI program file and then executes it, thus booting the OS.)
Wow, the BIOS has much room for improvement indeed, but I will still say it's a good piece of software, since it has run millions of motherboards throughout the years without many major changes... Unlike some OSes... ^_^
BIOSes stem from the original operating systems. So what you're looking at when you press DEL at the OEM logo was originally intended to be the GUI for really old IBM compatibles and the like.
I don't have an OEM logo; I make my own... Come to think of it, I could load my own logo into the POST screen...
And the BIOS menu couldn't be from computers that old, since the interface on an IBM 7000 series was way different... ^_^
True, my friends. It came from DOS. But I wrote my own BIOS, and this one is no exception. Its kernel isn't of the DOS type; if you have Coreboot, or have used it, you'll get the idea.
Anyway, I might post a screenshot of GO64 AIOS booting up when I get a chance. (And maybe a nice video of it booting a version of GO64, or Windows 7? If you really want to see an x86 Cell doing its job, I might, since it's sitting in my favorite test mobo, which is where I'll take a picture of my own firmware.)
Fav mobo? Wow, I wish I could say that... (Well, technically I can, since I also have my laptop ^_^)...
Makes me wonder, btw, what makes those ATC radar systems tick?
Okay, here goes: First, the radar (the digital version, the one that's entirely fool-proof) shoots out pre-determined pulses of microwave radiation, scattered all over the area being systematically scanned. Second, a special sensor (or, you could say, multiple sensors in one multi-directional module) receives the first ping, which is set aside as a reference for the FPU, and keeps collecting further pings, storing all the data in RAM.
Added after 12 minutes:
Third, the CPU starts to pull in the ping data, similar to Ethernet pinging, then brings up the reference-point data, has its on-die FPU work through all the stopwatch code, and puts out FFT formulas to paint the 3D radar image, sending the remaining fully-digested chunk of data to the next processor, the GPU.
Fourth and last, the GPU performs a special mathematical formulation (you bet your sweet ass it's nearly the same as what's done by Crysis) to take distance and shape into account in the radar imaging. There ya go!
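The core of the second step, turning an echo delay into a range, is plain time-of-flight arithmetic (a simplified sketch; real ATC radar adds the Doppler/FFT processing described above):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def echo_range_km(round_trip_us):
    """Target distance from the round-trip time of one radar ping.
    The pulse travels out and back, so only half the path is range."""
    seconds = round_trip_us * 1e-6
    return SPEED_OF_LIGHT * seconds / 2 / 1000  # in km

# A ping that comes back after 667 microseconds:
print(round(echo_range_km(667), 1))  # a target roughly 100 km out
```

Everything else in the pipeline (FFTs, 3D imaging on the GPU) is refinement layered on top of this one stopwatch measurement.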
Well, a little more elaborate than I imagined... and quite useful too... ^_^
Ehm, well, that leaves me to thank you... Thanks... :D
Okay, back to "favorite motherboard"
Why is it my favorite? It's easy to deploy any boot software onto its 16MB serial BIOS flash chip. It also has both internal and external JTAG ports, so I can choose to probe it or have it do so itself - very useful for debugging. And it has eight XDIMM slots, so I can keep adding larger XDR memory modules. Hmm, I like big memory capacity!
O_O 8 DIMM slots?! Oh, sweet; I usually only see that on dual-socket boards...
Yeah, and future CPUs might have a native quad-channel memory controller - in this case, four XMC blocks, that is, four XIO controllers.
And having 8 DIMMs will do us good, because we can just add memory modules as large as we want.
Hmmmm, delicious RAM...
I always drool when I see those mobos with lots of RAM slots...
Imagine: 16 slots with a 4GB XDR DIMM in each one, coupled with a nice Phenom X4 Black Edition...
O_O That would give you sustained bandwidth, on dual-channel XIO on either a Phenom II X4 or X6 Black, of 600 to 950 GB/s - that is, 8 XDR DRAM modules per XIO, with all 16 XDIMMs filled. A Xeon Dunnington would stand no chance pitted against this AMD CPU...
That's like pitting a BB gun against a sawed-off rifle?!
The only example I can think of with extreme bandwidth bottlenecking is a game of C&C TibSun I once played over a 10Mbit Ethernet hub... Blew the entire network away... :D
The same is true of a multicore CPU's FSB (both memory and HyperTransport/QuickAssist) - to make the best of it, sometimes, you have to break up the throughput, in the same manner you would want to tame a p*ssed-off mule. At least AMD made the best use of what's available for their Hammer RISC-based CPUs.
Right, because that was the second thing I thought of when I thought about multicore/oversized-core CPUs (that was years ago, when multi-core was still unheard of, but probably in full development): the limited number of pins a die supports...
Sure, a die can handle having a lot of pinouts, but the bonding wires have an actual physical limit; the size is a problem: we cannot shrink the diameter of Copper/Gold wire without snapping it as easily.
So the way we would want to try and avoid this kind of problem is to put the memory controller and host Northbridge circuitry on the die, to reduce the number of external pins as much as we can.
But the bonus is that we get pretty fast data traffic around the CPU core and outside it, thanks to the serial circuitry design.
Quote: "We cannot shrink the diameter of Copper/Gold wire without snapping it as easily."
That's why we need to start using Carbon Nanotube technology already. -w-
Ehm, if that were to be used now in commercial models, we'd be paying the same for a CPU as we did back then for a 386... some 3 to 4 thousand dollars...
Let them figure out an effective way of producing that first...
Sorry to bring it up, but a carbon nanotube is naturally a resistor by nature, because carbon is next to the worst conductor - unless it's made with a copper lattice. Copper is the #1 best conductor.
Wasn't that silver? At least that's what I've seen (waaaay back) in science class once... That silver has slightly higher electrical and thermal conductivity...
Today's science has proven that copper is actually first and silver second. Why? Because copper has a specific crystalline structure. They even tried heating it to see if they would witness any drop in Ohmic value. That's also why the Athlon K7, through to the Phenom II, has interconnects made of copper.
And Intel named one of their cores Coppermine... Interesting; I always believed it was a price issue... ^_^
Unfortunately, Intel didn't use copper in the Coppermine, until Tualatin's 130nm SOI process forced them to. (That was to prevent the voltage consumption from going out of whack. Remember why Intel recalled the toasty 5-volt DC Pentium?)
Ahhh, those lying bastards....
At least old processors were noble, with lotsa (in comparison) gold in them... ^_^
Yeah, I know... -_-;
Intel always gets in big trouble.
BTW, the Pentium is also an interesting CPU, because its FPU can perform precise mathematical processing, beating any TI calculator.
:/ Wait, isn't that something any CPU should have been able to do for the last ten years or so?
Yeah, and the Athlon K7 was no exception.
And I also wonder what I should do with the Cell processor in one of my workstations. Hmm.
What, no exception in being more of an electric grill than a processor? ^_^
I remember those early Athlons; they could get really hot without complaining, because they had no thermal protection circuitry...
Yeah, and I saw some of 'em committing seppuku in a YouTube video when a dude removed the heatsink - only the Intel CPU survived that, by slowing down.
I couldn't think of what I wanna throw at the Cell processor - too many things to imagine... (drifting in eulogy) BTW, what software do you think I should look for to demonstrate the Cell's power? Prime95 is useless; it's deemed too easy for this kind of CPU. Open source, please (since if I get to use an x86 Cell BE, I might want to port it to the x86 architecture.)
SuperPi or HyperPi (http://files.extremeoverclocking.com/file.php?f=36) is very popular in the overclocking community; they run it and screenshot it alongside a monitoring tool which shows temperatures etc.... And of course some sort of benchmarking program showing the TFLOPS and TIPS...
Seems good enough. Also, I would run Prime95 to see the difference between those programs while running them on the multi-core SPEs (the NT kernel would run on the XPE core [x86 Processing Element], naturally, because it's the bootstrap processor. Since the XPE is the core that boots up first, it has easier control over the SPEs within Windows.)
I may post a screenshot of the x86 Cell BE working its magic on GreenOmega 64 (with a command-prompt-based FFT calculation) when I get the chance to do so.
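For the curious: a SuperPi-style run boils down to computing digits of pi with big-integer arithmetic. Here is a toy version using Machin's formula (the real tools use FFT-based multiplication to go to millions of digits; this sketch is just the arithmetic core):

```python
def arctan_inv(x, digits):
    """arctan(1/x) scaled by 10**(digits+10), via the Taylor series,
    with 10 guard digits against accumulated truncation error."""
    one = 10 ** (digits + 10)
    total = term = one // x
    n, sign = 3, -1
    while term:
        term = one // (x ** n)
        total += sign * (term // n)
        sign, n = -sign, n + 2
    return total

def pi_digits(digits):
    """First `digits` digits of pi as one big integer, 314159..."""
    # Machin's formula: pi/4 = 4*arctan(1/5) - arctan(1/239)
    pi = 4 * (4 * arctan_inv(5, digits) - arctan_inv(239, digits))
    return pi // 10 ** 10  # drop the guard digits

print(pi_digits(20))  # 314159265358979323846
```

Benchmarks like SuperPi simply time this kind of loop at huge digit counts, which is why they stress the FPU/ALU and memory subsystem so well.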
Cool, please do... Meanwhile, I will be trying to get my very first Mac up and running...
Bought myself a PowerMac G3 at a second-hand shop while I was actually looking for old parts for a Duron 1100 for a friend... Only €17,50, so I just couldn't resist... ^_^
Added after 59 minutes:
Okay, I got it to boot up (finally - I had to put the graphics card in a 64-bit slot and remove a faulty DIMM), but now I am greeted by an icon which alternates between a Mac smiley and a question mark...
Yay, Mac, for being so specific about your errors... :(
Okay, it's happened to me too - it has the same kinds of problems as a PC. (No surprise, really.) What you're seeing is that some part has failed POST, or it's looking at a bad boot sector. Just be sure you're using Apple-certified parts, which contain the PowerPC instructions. If so, first try replacing the IDE / SCSI cable to the hard drive. If still unsuccessful, format the hard drive. There's a rare chance the Macintosh G3 / G4 system at the pawnshop contains a dead hard drive, so please check.
Thought so; I was thinking of reinstalling OS 9 to see if that helps... And I reckon that the installation disc of a Mac OS would have disk tools similar to what a Windows install disc has...
That system has been upgraded, though; there's more RAM and an 80GB hard disk alongside (I think) the original...
But I haven't tried everything yet, so I will continue to work on her...
This is my first desktop Mac, so I will not give her up that easily...
[dramatic doctor] She's still alive, I will not lose her, dammit! [/dramatic doctor]
Added after 55 seconds:
Also, I was pleasantly surprised to see a Socket A in there... (I always open up my PCs before I do anything else ^_^)
O_O WTF... A Macintosh G3 has Socket A?! (Excuse me if I'm wrong, because I have only worked on a G5.)
And if it DOES have Socket A, then Sonata has found her match in Athlon XP-kun! XD
And I have done Mac OS 9 emulation; it wasn't too difficult for an AMD CPU - I ran it on both an Athlon 64 and the x86 Cell, with custom firmware - it booted up fine.
It may be a bit harder on an Intel CPU, because Mac OS dislikes CISC processors, and the CPUs mentioned above are RISC.
Eehm, no... Well, yes and no...
Although it has a 'Socket A', I guess that stands for Socket Apple or something - you know, to confuse PC users... It has a third-generation PowerPC chip...
Here are some pics, btw...
Added after 1 hour 54 minutes:
I got her to work, by the way... Apparently the previous owner was very sloppy, or couldn't get her to work again, because now I can delete all his personal files... (Well, except for the music; I'm keeping that ^_^)
But I'm happy that I got her to work (without crashing within 10 seconds to 3 minutes ^_^)... Took me a while; I needed to replace a DIMM, rejumper the hard drive, and remove the PCI NIC, which was unnecessary anyway, since the Mac has a built-in 10/100Base-TX LAN port...
I am pleasantly surprised by the onboard speaker, btw... It produces nicer sound than those onboard PC speakers (not hard to accomplish)... And I'm a bit less pleasantly surprised by the sheer weight of the thing... 27 pounds (13kg!!) - wtf did they do to get it so heavy?!
Well, thanks for correcting me... ^_^;
Nice motherboard! It's good that you got her to boot up, now that she's usable.
Also, it looks like you can use an Athlon XP cooling fan on the G3 daughter card - just be careful about the fan clip, though. (You can use the original heatsink clip.)
Now, my buddy, why is your Macintosh case heavy? It's made of Japanese cold-rolled steel. I have a PC case made of the same material; it's heavy, but pretty durable. It has taken a few beatings already and the innards are fine.
Ah, right... I like my cases durable, so that's another plus for my Mac...
BTW, I am currently typing this on my Mac (which runs OS X 10.3.9 pretty smoothly). I decided I'd give my PC a break and hook my keyboard, wireless mouse (Microsoft, of all brands ^_^) and TFT screen up to the Mac, which works pretty fine, although I have to get used to the fact that what used to be [Ctrl] on my Windows machine is now the [Windows key] on the Mac... ^_^
Also, I achieved a major stability improvement by replacing the 80-wire rounded ATA cable with an old-fashioned 40-wire flat one... (Never hurts to keep some of the old stuff; I've even got SIMM32s and SIMM72s ^_^)
God i have to say it is really quiet now in my room... ;010
Quote: "i have to say it is really quiet now in my room..."
DAMN YOU SMOKEY
Hehe, i had to cool my other PC with 3 120mm fans, 2 80mm fans and a 240mm fan... so yeah it is very serene now that i gave my PC a break...
:smoke: At least I don't have to complain at all, because I'm already deaf.
But if I decide to use my CI, I could hear my computer, because it has two of Sunon's powerful 80mm fans, the ones that kind of look like a turbofan engine - one to suck cold air in, another to blow on a modified AMD Phenom retail heatpiped heatsink. When my mobo decides it needs more air, it will really start to sound like a turbofan engine accelerating.
Oh well, it's better than having a toasted CPU.
Added after 25 minutes:
Okay, pulled the picture of a rather powerful fan off goldmine-elec.com - My PC has two of 'em, exactly the same as what's in this picture.
[attachment deleted by admin]
;013 Srsly, WTF... I've been trying to get my PC to sound like a Pratt & Whitney turbofan (http://en.wikipedia.org/wiki/Pratt_%26_Whitney_PW4000) for ages... I like the sound of a turbofan starting up, so... Sounds like power.... ^_^
This fan does... That was why I had to plug it in beside the voltage regulators on the board, so no one would complain. Be warned, though, this fan makes a loud whooshing sound. If you like the sound of a P&W turbofan engine, this one is definitely the one to experiment with. Also, a fan this size (80mm) blows a gale at full voltage (12 Volts DC).
Heh, heh, heh... I will hunt two of those down and put them in my case... ;006
Here's a hint that will make your hunt a bit easier: go for the Vantec Tornado 80mm case fan. That's how I got this fan.
Also, best of all, it contains two ball bearings, which means this fan will last a long time.
Ah, nice... i really have come to hate those sleeve bearings... they get noisy the wrong way...
Man when i'm through with this system it's going to be the noisiest in town... Maybe i should go for liquid cooling to add more fans...
Me too. I despise sleeve bearings; they almost always ruin a perfectly good fan. Plus, they have a pretty short life, due to the bearing putting a damper on spindle speed, which also puts a strain on the BLDC motor driver chip. I'm sure the powerful fans that I put in will outlive the power supply fan.
If you think your new fan's too loud, use a fan funnel, it will help. I use mine sandwiched with an 80mm-to-70mm funnel, mounted on the AMD heatsink (with the original fan removed).
Added after 26 minutes:
Here's a picture of the retail heatpiped heatsink that came with a Phenom CPU. I don't own a Phenom yet, but mounted it on an Athlon 64 X2 anyway. Someone wanted me to build a computer and let me keep it, because the fan spun too slowly to satisfy me and he had another aftermarket fan to mess with. I replaced it with a Sunon 80mm fan, atop the fan bracket which used to hold a 70mm fan, via a clear blue plastic funnel.
[attachment deleted by admin]
Well, i'm keeping my Athlon 64 X2 4800+ cool with one of these (http://www.thermaltake.com/product_info.aspx?PARENT_CID=C_00000791&id=C_00000792&cid=C_00000010&name=BigTyp+120+VX&ov=n&ovid=)...
That 120mm fan is nice and powerful... And that fan controller is nice, just in case i want a bit less noise... ^_^
Yeah, I originally wanted a monstrous heatpiped heatsink like that one, but couldn't afford one, so I modified a retail heatsink. It stayed really cold (due to air moving through the fins fast enough), so I'm satisfied with it. Besides, I'm using AMD's dual-core CPU, so it doesn't produce that much heat. Phenom's another story... (but a Phenom ran fine with this heatsink when I took it in for a test drive, in my PC.)
Well, if AMD says that that block can keep a Phenom cool, it can keep everything up to and including a Phenom cool...
BTW, i actually bought that block because the shop i went to didn't have the Liquid Cooling systems i wanted... ^_^
But there's a webshop that still sells the Thermaltake Volcano... --->
It's a heatpiped LCS block that can accommodate an extra fan, so it will serve as an extra radiator too... Couple that to a BigWater 745 kit (http://nl.thermaltake.eu/product_info.aspx?PARENT_CID=C_00000402&id=C_00000403&name=BigWater+745&ov=n&ovid=) and you're set to cool even the most demanding CPUs, or a CPU + CrossFire setup... ^_^
I also have two 5.25" bays left (of the 6 in total, two are occupied by fans -2x 120mm-, one by the power/reset/audio/USB bracket and one by the DVD drive), so i can also install an extra BigWater 760is kit (http://nl.thermaltake.eu/product_info.aspx?PARENT_CID=C_00000940&id=C_00000941&name=BigWater+760is&ov=n&ovid=), which saves me the hassle of placing a loose pump + reservoir and gives me another 120mm radiator... giving me some room to overclock... ^_^
Looks like it could cool a six-core or octo-core Phenom II processor. I'm only going to use the Athlon 64 X2 system for a short time, then use it in a TiVo box while I browse for a new RISC x86 CPU that could satisfy me, one that could also support PC XDR-DRAM modules and have pretty good headroom for overclocking too. I may be playing a rather realistic video game, so I should probably grab a 16GB XDIMM memory module if I'm to build another PC in the near future.
Well, the wishlist for my next PC should make for a good gaming system, and will drain my wallet for the next couple of months... ^_^ Here goes:
*Motherboard: Asus M4A79T Deluxe (ASUS, for obvious reasons, and the Deluxe model also for obvious reasons)
*CPU: AMD Phenom II X4 810 (should be fun, to go from 2x2500MHz to 4x2600MHz ^_^)
*RAM: 2 Corsair 4 GB DDR3-1333 kits (i know, "just" 8GB RAM, but it should be fine, I'm running on 4GB now, so...)
*Videocards: 4 Asus EAH4830/HTDP cards (that is to say, if the store isn't lying and they do support CrossFireX)
*Cooling: a Thermaltake Volcano with a Thermaltake BigWater 745 kit and a Thermaltake BigWater 760i (for nice coolness, lots of extra fans - i love my fanbase ^_^ - and overclocking purposes...)
*PSU: Tacens Supero 1000 (because i think my 450 Watt won't cut it with all that new hardware; heck, i will need 4 PCI-E x16 power connectors, which the Tacens has and my current one has never even heard of... ^_^)
Totaling €1325,- ex. shipping and handling...
So yeah, cost alot, get alot... ;010
Yeah, it definitely looks like you'd be like a kid on Christmas Day, squealing in surprise when you see a ton of new presents - at least you will have a nice system to fool around on.
Yep, and one heck of a nice "old" PC to experiment with Linuxes, Unixes... heck, maybe i'll turn it into a nice domain server. I guess the WiFi-AP Solo software is supported by Windows 2000 Server, which would mean we can finally go wireless here in the house (I have 3 routers which don't work with the modem/router we have, and i don't want to buy a different modem/router because i believe my sis has lost the account info)...
Yo, smokey! On the WiFi-AP Solo thing - how taxing on the system is it to act as the network router, anyway? I mean, is it noticeable at all?
Not too taxing, but it still depends on how many devices (such as PlayStation Portables, PDAs with Internet capability and many others) - it can be too taxing on the modem, so if you want it to flow smoothly, use DSL or cable. And you can choose the setting: either totally visible ("free all-you-can-eat buffet"), visible but encrypted, or totally hidden and encrypted.
I don't think Nej has to worry about bandwidth, he has broadband, Sweden style... ^_^
And no, it's not too taxing; a proper AMD dual-core CPU can handle network, DNS and DHCP easily... And of course still have performance to spare for other stuff...
Well, I mean like this:
DELTA (quad 2.5 intel) acts as a router for 3 devices. These share a network bandwidth of 300 megabits. Is the operator of DELTA able to fully enjoy a game of say, Supreme Commander at full settings and unit caps (GASP), even if the other 3 devices are sucking all they can at data and DL'ing over torrents and shit?
I can be very short with this answer...
I don't have any experience with the WiFi-AP Solo functionality on Intel boards, nor do i have any experience with Intel after the first-generation P4, but it would be safe to say this: if you were to tax the WiFi-AP on your mainboard as heavily as 300Mbit, which i can hardly believe (since the bandwidth of WiFi networks has only very recently reached multiples of 100Mbit), i would guess that all that bandwidth, combined with DHCP and DNS (at least if you were to run those on Δ), would be enough work for both the WiFi network controller and the CPU (srsly, i don't know Intel, so i don't know if the CPU has to help the network controller, but it probably does, since WiFi-AP Solo involves some software running in the operating system) to make your system notice that there are a couple of other devices on the network. This happens naturally, since the DHCP and DNS server will know what devices are present in any given network...
As for that impacting performance, i highly doubt that your games of Supreme Commander will be affected that badly or, which is more likely, that they will be affected at all...
So in conclusion, as long as you're not dl-ing with uTorrent and playing SupCom online, there's nothing to worry about...
(srsly, uTorrent is a huge bandwidth hog)
I think it's not gonna hurt your computer, as long as you have enough RAM. And you already have a multi-core CPU, so you will be fine. (But if you're DL'ing off uTorrent and playing a game on a single-core CPU, you're f***ed. But that won't happen anyway.)
And, a bit off-topic, I have a picture of my own modified retail heatsink setup:
[attachment deleted by admin]
Well, I certainly hope so. ^^ That aside, SupCom IS ABLE TO USE ALL CORES ON THE PROCESSOR THOUGH so yeah, that resource hog might just take all that shit down by itself. Yum. -w-
Yes, i know... SupCom is actually programmed quite well; it even has multi-videocard support for a second monitor on which you can display the tactical map...
And back to the fan... Nice way to incorporate that fan on that block... ;010
I only have two things (not bad, don't worry ^_^);
1: Is that an MSI board?
2: That XDR is the strangest memory i have ever seen... ^_^ (and i've seen Topless SIMMs)
Added after 5 minutes:
BTW, by reading some wikipedia (finally looking up XDR and then reading down to the other RAMs) i found this... (http://en.wikipedia.org/wiki/Z-RAM)
Yeah, it kept the Athlon 64 nice and cold. And it's a Gigabyte MA78GM-S2HP motherboard, and if you look below, you will spot an Avermedia TV tuner card. And, sadly, the XDR memory is in my other computer (I will also take a picture of its innards) - I am staying with family for a while; when I go back and see my wife, I may be able to do so. (She still has the majority of my shits at her house...)
Hmmm, sounds personal...
Anyways... meanwhile i can probably take a couple of pics of my current system and post them... (if i'm lucky i'm getting some money back from the tax agency, so i can get me that nice watercooling set i wanted... ^_^, will post pics of that too when i have it)
Yeah, and I am glad that I'm leaving soon, though... And I am posting my desktop, showing my CPU's on-die temperature with the modified heatsink you saw earlier, giving you an idea of what the Sunon 80mm turbo fan is doing to the heatpipe. And yeah, I like Nagomiko's work - one reason to put up a wallpaper of a miko!
[attachment deleted by admin]
WTF?! 12 degrees?! WHAT?!
I don't get mine that cold, but i do get my CPU at roughly the same temp as the case (mobo)... see stats below...
And as promised, a photoshoot of my PC... (don't mind the mess, i hope it will disappear when i install liquid cooling)
BTW, all pictures were taken with the system still running, the fans that appear to not be running are running, my camera is just so fast it captures them standing still... ^_^
Okay, here's the front of the PC; as you can see it's nice and airy, with a 120mm fan installed. I have another 120mm fan which i will install under it tomorrow... And yes, that is a Garfield plush, and yes, that is GITS and LOTR you see there...
Next we have my case from an angle, taken without flash to show all the fancy lights... ^_^ BTW if you look closely, you can see that Garfield is sitting on that other 120mm fan...
And here we have my case from the side, showing that hueg fan... :D
Next we have a shot of the inside, showing the loosely organised chaos... ^_^
Here's my SLI setup (two ASUS GF8500GTs) and above that is my TV tuner card, which is useless in Vista due to driver incompatibility...
And here's my 4GB of DDR2-800 DRAM; i installed some heatsinks on them, since Corsair didn't do that with their ValueRAM...
Here's my case manufacturer's ingenious solution for storing the hard disks... ^_^ (above the PSU, btw)
But i have two more HDDs (and more to come ^_^); btw, if you look carefully you can see that the bottom one is my PATA disk, connected with an (old, very old) rounded cable...
So that is it (plus the attachment of course, showing the status of my PC), let's have the comments... :D
[attachment deleted by admin]
Freaking nice case! And the CPU temperature can still be improved with a much faster Silverstone fan. I'm using a 45-Watt Athlon 64 OEM CPU (I didn't care about an OEM processor chip, let alone customizing my computer - also, after such heavy usage, it stayed at 20-25 Celsius at full throttle. I used Arctic Silver 5, my favorite choice of thermal paste.) And this case definitely reminds me of my wife's custom system, which she built herself (with some of my help.)
I'm also an anime addict; it's my own drug. ^-^;
Added after 13 minutes:
And, eeehm... My case here isn't pretty at all; it used to be part of an office computer. It was of the Datalink (DTK) computer brand, but I got it for free - a bank junked it. Best of all, it's made in Japan (assembled in China, but weight don't lie) and totally customizable (so I can fit my new blue motherboard.) And how did the CPU get so cold? Simple: the fan intake is 1-2 inches away from the removable side panel, and the bare metal of the panel simply cools the air down before the fan sucks it in.
You like? cool, that means all that effort hasn't gone to waste... :D
Oh, btw, i forgot to get a pic in of the front with the brushed aluminium door closed... ^_^ So here it is:
btw, that red thing on top of the case is a gel-based hourglass... ^_^
And i don't think i can get my CPU's temperature much lower with the current type of cooling i use, since the ambient temperature inside the case itself is too high for that... But that will be solved when i get liquid cooling, because i will then have a double 120mm radiator outside the case, and one in the relatively cool front of the case... (Or i will have to reverse the airflow, which would mean one at the relatively cool back, and the radiator on the processor - naah, that CPU radiator will get cool air anyway from the 240mm fan at the side... Plus, then the only thing which will be cooled with a "conventional" heatpipe, and therefore producing heat in the case/around the MoBo, will be the northbridge.) (i plan on cooling the GPUs and RAM with liquid as well... ^_^)
And as for the anime, yes, most stuff on TV here is crap anyway, so anime is a welcome change... :D
And as for the case: I have had many different cases in the past, from a 16U PowerEdge, to a big-tower ProLiant, to boring beige, to simply black... this is the second "nice" case i've had and the first i actually love (i love the simple design and the way the mainboard is fitted)...
Which is good, 'cuz the darn thing cost me €115,- without PSU, so yeah, another (be it small) reason to make this my flagship... ^_^
Yeah, I have neither CATV nor ClearQAM here either, not yet... T-T (Yes, my TV tuner's capable of tuning into digital TV channels; the only problem is, the antenna I made is a shitty excuse for one, feeding the ATSC tuner in my computer.)
Plus, you can also try a Peltier (TEC) block, if you think water cooling's too expensive. You will want to condensation-proof your motherboard, too. I have done it before; I'm using it on an x86 Cell processor, and it really kept it subzero cold, even under intense load.
Added after 14 minutes:
Also, please make sure that either your current heatpipe or a new one can stand the massive heat from the Peltier block, just to be sure. (I think you will be fine; Thermaltake's heatpipe can withstand this kind of punishment, and a heatpipe is highly recommended for a Peltier setup. Run the Peltier off 2 AA batteries for a short time to find the cold side, which goes against the CPU's heat slug to cool it. Yellow wire of the PSU: 12 Volts / red wire: 5 Volts.) To make it f***ing cold, link the PSU's yellow 12 V wire to the red wire of the Peltier block.
Added after 12 minutes:
P.S. Make sure your heatsink's clip can stretch that much further without shattering the Peltier block. And neoprene foam (black poreless foam - please be careful around the sticker glue!) is the best (so they say...) for condensation-proofing your motherboard. If your heatsink uses a screw-type clip, great! (Those always use springs alongside the screws, to equalize the clip pressure.) I don't think you would go for electro-cryogenic cooling, but if you do, awesome.
Well, from the cooling methods i have reviewed, i did like Peltier, because it's way, way out-of-the-box thinking, but it is also near impossible to come by in Holland, and damned expensive when you find a shop that actually sells one. Plus, a Peltier will need a cooling device to keep its hot side cooled, so i will still need a liquid cooling system, or better... Peltiers are heat pumps, really flippin' good heat pumps, but nothing more than that... Chuck a Peltier element on a CPU and expect it to do its work on its own and you'll fry the CPU, and likely the Peltier element too...
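To put numbers on that hot-side point: a Peltier is a heat pump, so its hot side must shed the CPU's heat plus the element's own electrical draw. A minimal sketch in Python, with assumed figures (a 125 W CPU and a TEC pulling about 8 A at 12 V; neither number comes from this thread):

```python
def hot_side_heat(cpu_tdp_w: float, tec_power_w: float) -> float:
    """Heat the Peltier's hot side must shed: the pumped CPU heat
    plus the electrical power the element itself consumes."""
    return cpu_tdp_w + tec_power_w

# Assumed figures: 125 W quad-core CPU, TEC drawing ~8 A at 12 V.
cpu_tdp = 125.0
tec_power = 12.0 * 8.0   # 96 W of electrical input
print(hot_side_heat(cpu_tdp, tec_power))  # -> 221.0 (watts the cooler behind it must remove)
```

So whatever sits behind the Peltier has to move nearly twice the CPU's TDP, which is exactly why a bare Peltier on a CPU fries both.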
I have also taken a (brief) look at liquid submersion cooling, which is also a pretty cool idea, as long as you use a dielectric fluid... but waterproofing (water in that term not to be taken literally ^_^) the case is too much of a hassle for me, and i would end up draining the complete system far too often because i want to add/replace components...
Liquid nitrogen / liquid helium... yeah, extreme, and extremely cold (nitrogen goes to -196C and helium to -269C), but also extremely impractical, expensive and lethal to the CPU, so i'll pass on that one... ^_^
Then there is phase-change cooling, which is interesting and also extremely efficient (i would dare say it rivals Peltier, because Peltier cooling is highly dependent on the type of cooling used with it), capable of turning the PC into a tiny piece of Alaska... ^_^
The only drawback (well, aside from the need to insulate every part of the cooling system, since the pipes will sweat, and of course that it can only cool one component per unit) is the price; it costs twice the money of the quad-radiator liquid cooling setup i have planned for the PC...
And well, active air-cooled heatsinks: yeah, they of course work for the components, and they work well, but i'm not aiming for "adequate", i'm aiming for complete "overkill", so that idea is out of the question...
Passive heatsinks: eehm, we don't live in the nineties anymore, and even though i have two videocards which are still cooled by this method, i doubt that i will ever see a practical passive heatsink for a quad-core Phenom, and i doubt that it would be able to handle many excesses... so no, thanks...
No heatsinks: come on guys, srsly; I'm not running a bloody 386 here...
Heatpipe: yes nice and pleasing to the eye too, but i already have it and i want something a bit better... ^_^
Leaving me to settle with watercooling...
And maybe, just maybe i will give Phase-Change a shot, but that will be later on, way later on... ^_^
Oops, ranting again... ^_^
I agree with that. But I still stick with Peltier for a powerful CPU anyway. And it will still have the same problem: the board around the CPU will just sweat as well. ._.
Plus, the surplus shops in both Japan and America have plenty of those little b*****ds, so I can just get my hands on them and buy 'em stupid cheap. (I'm gonna warn you, my Japanese friends: surplus shops there that get Peltiers from scrap are pretty hard to find. If you have found 'em, wonderful! -
Added after 1 hours 37 minutes:
However, if you Japanese readers find me wrong, please correct me here... I sometimes don't remember very well. Thanks. *bows*)
And I love Japan so much, so many beautiful places and fields! (reminiscing about a wonderful evening at a temple)
Oh, sorry if I'm derailing off topic...
Well, phase-change coolers come with all the insulation and padding needed (at least those from Asetek do; for €694,- they'd better have that stuff included...)...
But i think that when i do get my hands on one of those VapoChill units, a Peltier heat pump might actually be a good idea to stick in between the CPU and the cooler...
Added after 10 minutes:
Btw, i have found a relevant cooling block... ;010
That will just work fine.
And, by the way, the computer case temperature is also affected by how cold or warm the room's surrounding air is. It doesn't affect a water cooler by a large percentage. It does affect a regular heatsink, though.
Yep, that's why i want to go beyond air cooling, because in the summer those fans would just be turbocharging hot air into my PC... (my room gets all the sun, and it's under the roof, which makes it nice and toasty in the summer)...
And of course, should i get my hands on 4 crossfireX cards, watercooling would prevent vacuum problems... ^_^
Yup. And I don't think I'll be having that much of a problem with air cooling yet... If I get to move back to Japan, well, it will get real toasty in the summer. Here in Montana, sort of (Helena is actually a desert, at least that's what I heard) - oh, and if anyone doesn't know about Montana's weather, it's definitely strange: Hate it? Wait ten minutes. Poof! Now you're satisfied with the weather.
Oh, we have weird weather here too... Back in Hardenberg, (the shitho- my birthtown) if it was raining on one side of the river, it could be sunny and warm on the other... and that river was just 25 meters wide...
Yep, that was what I heard.
Okay, i know i'm not supposed to ask that here in this thread, and i also know that i am once again being the king of OT by doing this, but i do so have to know...
How did you hear that??
(man, i am looking so surprised now, my eyes are starting to hurt)
Okay, here goes: I heard that from a few people, and I also fooled around on the Internet (good old Wikipedia and Google!)
Also, ever wonder what will happen if the fan on the heatpipe just quits working?
(It just happened to me - it shot to 63 Celsius! I had to straighten the metal strips in the fan plug; it's working fine now, and I'm watching the on-die temperature, just to be sure.)
Added after 4 hours 44 minutes:
Got it right back on track. I burnt it in by playing a test video file (similar to a Blu-ray Disc's M2TS) on VLC media player (with the multi-core usage feature on) for a short time - it got nowhere near 23 Celsius, even at full throttle. Whew, almost fried it! Luckily, it used German Army specification hi-temp silicon, otherwise I would have had to buy another CPU. (Seemingly, AMD made the right decision using this kind of material right in Germany. Japan's would be better, but this is still better than nothing.)
Ouch! Man, that reminds me of the time i got to service William's PC, after he had brought it in for servicing with his regular PC guy some ten times...
Turned out all that guy did was reinstall Windows (not every time a PC runs unstable is it Windows' fault), so when i got it, the first thing i did is what i always do: open the case up. And i saw dust... lots of dust, and a dust cake about 5mm thick between the CPU's heatsink and its fan... so i knew what the problem was... Unfortunately, when i removed the heatsink i saw a perfect impression of the CPU burned into the bottom of the heatsink... Long live the Thunderbird's lack of thermal protection...
Wow-whee! No wonder it's named Thunderbird. ;014
Luckily, AMD etched a thermistor on-die inside their 64-bit CPUs, so that if one gets too hot it shuts itself down. But blame the motherboard if you got a fried 64-bit CPU.
Also, I noticed that once I corrected the strips in the fan power plug, the retail heatpipe cooled down quickly with the fan running - now I'm wondering, does it contain R134a refrigerant or ammonia gas, for that extent of cooling?
Well, heatpipes are usually filled with a cheap liquid, like water, as far as i've heard... but then again, who knows what stuff they put in there...
I guess AMD may have known better than to put water in there, given that the retail heatpipe (that came with the Phenom CPU) has an uncannily fast recovery - water doesn't do that.
I suspect there may be one of three things in AMD's latest heatpipe: alcohol, R134a refrigerant, or ammonia gas.
Well, then i guess it would be alcohol, since R134a would be a bit expensive and ammonia a bit dangerous... And, well:
Quote from: "Wikipedia"Some example fluids are water, ethanol, acetone, sodium, or mercury.
Quote doubles as link
BTW, mercury sounds a bit dangerous too, as far as i'm concerned...
Yeah. And mercury is only intended for use in a hermetically sealed computer, as part of an upgrade package in the Hubble Space Telescope, for example. Also, AMD can choose to use R134a refrigerant if they want. I think it is very likely to already have R134a inside, taking the Phenom's TDP into account.
Yeah, but heatpipes work because a liquid evaporates from the heat produced by (in this case) the CPU and then moves to the cooler part of the heatpipe, where it condenses and moves back to the hot side... i don't think a substance which is already a gas at room temperature will do that very efficiently...
But i will have to look up Asetek's new phase-change cooling solution to get a definite answer on that...
Added after 8 minutes:
Well, that search turned out inconclusive, due to a lack of practical or technical information about their newest cooler...
Well, at a certain pressure, R134a can become liquid with just a set of radiator fins and a fan. But the wick has to be made of ceramic fiber, made hollow - in order to suction the liquefied R134a refrigerant back to the CPU block, to be vaporized. It doesn't have to be a passive heatpipe, like the VapoChill Micro.
Okay, i have sent an inquiry to AMD requesting the technical data on the substance used in the boxed heatpipe coolers shipped with their CPUs...
I really want to know. I've probably been watching too much House, but i want a clear answer instead of guesswork...
Yeah, I agree. Still, it's definitely not water, though.
Oh, no... I guess it's alcohol, since it evaporates at ~70C, making it a quick-responding fluid, and it's cheap to come by...
I thought so. Also, you found that people say XDR-DRAM is never good enough, right? Me, I disagree (because I have used it. Sure, I admit it's hard to program the XMC to get the main memory activated, but it's still similar to what's in a few of AMD's programming manuals.) So, what do you think about that?
Well, although i find the name a bit uninspired... I do think it's the logical next step in RAM evolution, and i am also glad that the company Rambus isn't just a part of Intel... which i had thought, because i only ever saw RDRAM in Intel machines...
I agree Dr. Mario, although activating the main memory is QUITE a pain in the ass......
True. It's a pain in the ass - you (will) have the garden-variety XDR-DRAM memory, for which you must program the CPU to collect the SPD data, then clock the pulse trains (just like you would do with DDR memories), then map it afterwards. But the pains and tears will be worth it: you will have access to much faster main memory in either your homemade OS or Windows (with the Rambus XMC driver that you either made or downloaded off the CPU maker's webpage.)
That's fine, Smokey.
Added after 10 minutes:
Not many people find it uninspired. And it's indeed the next step; XDR's next to be used by new computers. Also, some people said it's slow - they're wrong. I have actually used this kind of RAM, directly linked to an x86 CPU (this processor's still pretty new), AND ran SiSandra's RAM bandwidth test; it went off the chart (ranking #1 in that test)!
Well, i'm just used to the X standing for eXtended, like the A stands for Advanced...
But then again, i do still linger in those olden days of Computer technology...
Besides, XDR will eventually be surpassed by a faster technology, so what do we have to call that? Because "extreme" is usually followed by "ultimate", which has no follow-up...
So all in all, i do hope to see XDRAM on AMD boards in the future (just not the very near future, since i'm planning to buy a DDR2/3 board in the very near future)...
Man, looking at that MoBo of mine just gives me that feeling i want to mess around with lots of cheap hardware again and build something nice out of it...
Added after 1 minutes:
And by something nice, i actually mean a beautifully crafted, insanely complex-to-use computer which has far too much power for its age... ^_^
Very true. At least a few companies are working on it. (ADR-DRAM, maybe?) And I think future GPGPUs will use it first, before any behemoth CPU or even a 128-bit x86 CPU uses 'em. (I know they already made the silicon chip, very much with a High-K Metal Gate transistor process - they're pretty much test-driving it to see what falls apart first, and making a list of what to fix and improve - before mid-2010.)
Wait, why would we need 128-bit computer technology...?
Quote from: "Wikipedia"...64-bit architecture effectively increases the memory ceiling to 2^64 addresses, equivalent to approximately 17.2 billion gigabytes, 16.8 million terabytes, or 16 exabytes of RAM.
So, however interesting 128-bit CPUs are, and however much i want one when they get here, i really don't see the need, because i don't see RAM technology evolving so quickly that it will give us 1EB memory modules...
And to achieve such vast quantities of RAM with current technology (i am using 4GB modules for this calculation) we would need 4,300,000,000 DIMMs... I don't really see the motherboard real estate for that, nor can i imagine any program or process which would need such a memory space...
But like i said, i would so totally want one if they come, just for the coolness of it...
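The back-of-the-envelope figures above check out; a quick Python sanity check of the 2^64 numbers and the 4GB-DIMM count:

```python
# Verify the 64-bit address-space figures quoted from Wikipedia above.
ADDRESS_SPACE = 2 ** 64          # bytes addressable with 64-bit pointers
GIB = 2 ** 30                    # one (binary) gigabyte

print(ADDRESS_SPACE // GIB)      # 17179869184 -> the "~17.2 billion gigabytes"
print(ADDRESS_SPACE // 2 ** 40)  # 16777216    -> the "~16.8 million terabytes"

# Filling that with 4GB modules, as in the post above:
dimms_needed = ADDRESS_SPACE // (4 * GIB)
print(dimms_needed)              # 4294967296 -> the ~4.3 billion DIMMs quoted
```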
Yeah, I get the picture now. Although it's still possible for the x86 CPU to evolve into a 128-bit one, it would still have 64-bit physical addressing anyway. And I knew about it and experimented with Extended Mode (a 128-bit mode); the ASM instructions are pretty much the same:
example: AL - AX - EAX - RAX - CAX
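A small sketch of how that register family nests; each narrower name is just the low bits of the wider one (note: "CAX" is the hypothetical 128-bit name from the post above, not an existing register):

```python
# x86 register aliasing: AL is the low byte of AX, AX the low word of
# EAX, and EAX the low dword of RAX. A hypothetical "CAX" would extend
# the same pattern to 128 bits.
rax = 0x1122334455667788          # a 64-bit register value

eax = rax & 0xFFFFFFFF            # low 32 bits
ax  = rax & 0xFFFF                # low 16 bits
al  = rax & 0xFF                  # low 8 bits

print(hex(eax))  # 0x55667788
print(hex(ax))   # 0x7788
print(hex(al))   # 0x88
```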
Oh, we should be developing it, of course... and it helps that the instructions don't differ that much...
We will reach the limits of 64-bit soon enough, just not in the next 5 years or so... ^_^
We have a saying in Holland justifying the 128-bit research and development: standing still is going backwards...
True. And the development of a 128-bit RISC engine for an x86-128 processor already began almost two years ago. And the AMD Phenom is a perfect example of that - its FPU contains three 128-bit ALUs per core.
?? Are there such great numbers to be crunched today that they require a 128-bit ALU?
Well, it's simply because the Athlon 64 cannot catch up with the newest DX10 (and DX11) games and some video files (like the ones off a Blu-ray Disc) without using too many watts in one sitting. They're freaking math-intensive nowadays, so much that a 64-bit ALU inside the FPU circuitry can sometimes never be enough, so - as part of an R&D effort - they put a few 128-bit ALUs in the Phenom's modified FPU, to see what's really rolling.
Quote from: "Dr. Mario"...Athlon 64 cannot catch up with the newest DX10 (and DX11) games and some of video files (like the ones off Blu-ray Disc) without using too many watts in one sitting.
Quote from: "Dr. Mario"...Athlon 64 cannot catch up...
Quote from: "Dr. Mario"...cannot catch up...
Then Intel had better be hurting under those processes too... or I'll make 'em hurt...
Sorry, but it pains me to read such things as a die-hard AMD fan...
That was why AMD had to introduce the Phenom CPU (although, thanks to a memory controller / Level 3 cache bug that some AMD engineers found, it never succeeded. Phenom II did.) AMD wanted to show Intel that it does have the liver to whack Intel out of their game, by giving everyone a taste of what a silicon-based speed demon should be. After all, I'm still an AMD fan. I use that Germanic gem daily. (If you look at the CPU heatslug, you will see that it's diffused in Germany.)
Good ol' German Quality... :D
BTW, I just read about how BTX (the form factor) failed (http://en.wikipedia.org/wiki/BTX)... and in there it said that
QuoteTo date, AMD has offered few BTX product options and has emerged as major and viable player in the computer industry
Which does sound logical, since AMD bought up ATI and with that grew a great deal bigger...
So I shouldn't be too worried about AMD after all...
Besides, AMD will probably live anyhow, if only to prevent Intel from gaining a monopoly (although it would please me more to see Intel reduced to that fate)...
Yup. AMD might be around for a long time.
AMD has signed the contract for XDR memory (along with PCIe and several others), and AMD might let loose Phenom II's hidden extra horsepower once they get PC XDIMM together - although long postponed, it will be very soon. Socket AM3 is definitely one of the ones that has XMC circuitry. I already looked at a picture of the Deneb die, and studied the RAM controller very carefully - it's an open-paged multimode XIO, pretty much capable of running DDR.
Well, I knew AM3, along with the Phenom, could handle DDR3 as well as DDR2, which is one of the reasons I love that platform (I can buy the MoBo, and invest in processor and RAM later ^_^), and if I remember correctly you said it could handle XDR too, so that would mean the Phenom is a real do-it-all... ^_^
The Phenom's built-in XDR-DRAM support is all part of a strategy - pretty much a surprise attack, and it's scaring Intel right now. Because: 1. a 4.0GHz XDR can easily outperform GDDR5 of the same frequency. 2. It has support for DDR-II/III memory and CAN be used on an AM2+ motherboard (it already has an XDR clock generator on-die, for compatibility reasons). 3. The XIO controller isn't as pricey as they thought - the Phenom II X4 is only $180. 4. It could simply kill the Core i7 once its FSB throughput gets big.
Ah, so the empire of AMD strikes back... good to see, because they were losing ground to Intel lately...
Quote from: "Dr. Mario"Phenom II X4 is only $180.
Wait, wut?! A Phenom II X4 810 (that Deneb with 4x 2600MHz) costs €179,- here, that is $236.10! Man, I need to find a way to get my hardware from the States... :/
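(For the curious, the exchange rate implied by those two prices is just arithmetic on the numbers above:)

```python
# Implied USD/EUR rate from the two quoted prices for the same CPU.
price_eur = 179.0    # Dutch price
price_usd = 236.10   # the same amount converted

rate = price_usd / price_eur
print(round(rate, 3))   # ~1.319 dollars per euro at the time
```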
Yep. It often is weird when it comes to local currency. Also, newegg.com sells 'em dirt cheap, even items of Japanese workmanship quality (which are usually expensive) - I have had no problem with 'em.
Since you're not in the USA, I would say it's pretty much out of your reach, but asking never hurts, anyways.
Meh, even if I don't find a site which ships overseas, I can always go on vacation to the US and pick the stuff up... ^_^
BTW, Newegg does rock... gotta get in the air force and get sent to the US for a year or so; that way I can stock up on hardware... :D
True. Newegg is the best.
Well, you can ask Newegg if they can ship them to your home country. If they can't, blame it on trade regulations. I sometimes hate import regulations: the importing license is so frigging expensive.
They can't, which is too bad, since they have some really nice stuff...
(they are lacking a bit with watercooling though... ^_^)
Bummer. That kind of sucked.
Meh, I have a feeling that I will go to the US sometime... I also have a feeling that I may be able to earn a few bucks off my colleagues when I do... ^_^
If I can have them save half the money I do, they'll be happy, and I'll be happy, because that other half will be in my pocket... :D Glad it isn't illegal to buy lots of stuff abroad and sell it here... just a bummer that the taxman'll want a cut when I do make some real cash off of it (you can earn an extra €500 a month tax free here)
Also, completely unrelated to you, but related to me and my keyboard layout: what does your computer produce on screen when you press [ctrl] + [alt] + ?
On my PC, it gives a nice €. I discovered the wonders of [ctrl] + [alt] + [key]-ing when [ctrl] + [alt] +  gave me a ¼ and I decided to see what happens if I were to do that with the other keys... ^_^
Uh, what OS were you using when you did the key combo? I'm using 64-bit Windows XP (cuz I like seeing my own Athlon entering Long Mode) - did the same - nothing.
Oh, and be careful of tariff law - it's much stricter here in USA and in Japan (They share exactly the same trading law.)
AND! Annnd... (panting) I'm planning to build a nice high-voltage power supply to power any gas laser I can touch - it involves the use of computer technology in the form of a Microchip PIC18F to switch the IGBT bridges on/off -
Added after 12 minutes:
- to give me the voltage I really need - on the order of a few hundred to a few thousand volts AC, just by adjusting the PWM timing with an encoder knob. I'm also using an old DELL power supply (from a Pentium 4 Northwood system) to feed a transformer. The transformer came from a dead inverter-powered microwave oven and I rewound it to my specification. I used a thick speaker wire to obtain a high-voltage current of a few amps, just to power a carbon dioxide laser. And the transformer is of a flyback type.
BTW, I am using Vista Ultimate x64, with the locale set to Holland and the keyboard set to US International... I guess that € is specific to Europe then... but you don't get any characters when [ctrl] + [alt]-ing keys?
And as for the tariffs on exporting.... nice and complicated, so i'll probably just end up buying some stuff just for me... ^_^
Oh, cool, lasers! I always wanted to build a high-powered laser, but I wasn't allowed to turn my room into a lab... I'd have built a Tesla coil too if I could...
Yeah. I'm having a mix of laser and Tesla coil. The transformer is pretty much a Tesla coil on steroids - I wound a long, thick speaker wire for a secondary winding to give me a few amps at 10,000 volts AC. I'm also going to use a PIC18F to inject a 28 kHz pulse to make the speaker wire wound on the transformer pour out enough power to light up nearly 90 100-watt / 125-volt light bulbs connected in series, so yeah, I'm talking about serious power here. I may be operating either a CO2 or -
Added after 10 minutes:
- a metal vapor laser; they require LOTS of amps, much like the power pole, only DC. (It will be rectified once the transformer spits out the high-voltage AC current.)
Why a PIC18? Safety first. And being able to tune the AC frequency, in turn, changes the voltage it's producing.
My wife wouldn't mind too much if I were to turn my room into a lab. (I usually do it away in a tool barn.)
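A rough sanity check on that 90-bulb figure from above, treating each filament as a fixed resistor (real tungsten filaments change resistance with temperature, so this is only ballpark):

```python
# 90 bulbs rated 100 W / 125 V each, wired in series across ~10,000 V AC.
n, rated_w, rated_v, supply_v = 90, 100.0, 125.0, 10_000.0

v_per_bulb = supply_v / n                            # volts across each bulb
p_per_bulb = rated_w * (v_per_bulb / rated_v) ** 2   # fixed-resistance scaling
total_w = n * p_per_bulb

print(round(v_per_bulb, 1))   # ~111 V each, close to the 125 V rating
print(round(total_w))         # ~7.1 kW total -- serious power indeed
```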
Heh, well, I guess I can't set up such a lab, because of 1. the power drain and 2. they think I might kill someone or destroy the house if I do (I have a tendency to go overboard on things... and holes in walls aren't such a nice thing ^_^)
Added after 32 seconds:
I was also once thinking of the feasibility of a home-built particle accelerator...
Yeah, I agree - at least you know that working with that kind of electronics requires knowledge and guts, to make it out alive and make it work.
And I'm using a fuse, circuit breaker chips, and a relay switch, to stop supplying power when something goes wrong.
Plus, I'm using an old computer power supply, mainly for safety reasons. 20 amps at 12 volts DC will be enough for me to operate the transformer off the H-bridge IGBT array, but I figure it's better to use a 5 amp / 250 volt slow-blow fuse, to -
Added after 12 minutes:
- prevent the primary winding from getting cooked when the software on the PIC18F simply won't cooperate with the PWM timer, or when the thicker secondary winding is hooked up to a heavy load, i.e. 90 100-W light bulbs or a copper vapor laser - since the primary winding consists of a single strand of 20 AWG wire from a DIY Ethernet cable.
Also, I have worked with an X-ray laser before - and it's way nastier than either an infrared or ultraviolet laser, let alone light of super-short wavelengths, invisible and dangerous.
Well, X-rays do produce radiation, no matter what device they're in...
Yes. Although, on one hand, not always - because some materials, like lead (i.e. solder), absorb it entirely. Air does too, producing a faint blue light and rather strong ozone. On the other hand, the incident radiation (i.e. X-ray laser radiation) striking a metal like copper can produce secondary radiation, either white X-ray light (like a light bulb) or X-ray laser radiation (of a different wavelength).
Although an X-ray laser is dangerous, it's a bit safer than ramping an old power triode, because the laser -
Added after 11 minutes:
- beam stays confined in one spot, that is if it's well focused, until you tweak its line of path with a mirror.
And I was interested in building an X-ray laser because then I could observe what a regular old 1N4001 diode does when I shine it on one, see the pattern of atoms (with the severed face of a dead CRT, while shining the laser on a few different kinds of metal), and even see how fast it drills a hole in a lead brick. It was (sort of) simple - it's filled with nickel vapor, mixed with xenon.
Added after 11 minutes:
Why xenon and nickel? First, nickel was and is an excellent material for emitting X-rays, and it also works as another lasing medium. Second, the xenon gas is used to heat the nickel up until it vaporizes to the point where current starts to flow through it. And the xenon also cools the nickel vapor down a bit once the transformer switches off, making it ready to fire another round of X-ray laser emission when the transformer turns back on.
Well, I'll have to brush up on my lasering again, and maybe I can build a small one without anyone noticing (that is, until my sis wakes up with a laser beam above her feet :P)
Could be sweet though, if I could power it with a (wind) generator I designed... ^_^
Try going with a laser diode - it's fairly easy to handle, except that it's sensitive to static electricity.
If you do, all you have to do is put a lens in the module and focus it until the laser dot is the same diameter as when it's emitted out of the module.
Oh, and if Microsoft said Windows 7 is based on Vista, then its core is pretty much still a part of Windows Longhorn (Vista's actually a part of the Longhorn family) - a picture of the "Windows Vista Beta 1" Vistans came to my mind.
Added after 26 minutes:
This is what reminded me that Vista was and is a part of the Longhorn family. I know it's a loli, but I didn't care much; I like to collect pictures of OS-chan.
[attachment deleted by admin]
Heh, I just found a page where they explain how their 5-10 watt CO2 laser was built... They also explained that 5-10 watts of CO2 laser can singe wood and melt plastic, as well as burn flesh/skin and destroy eyes... Makes me wonder what a 50-100 watt CO2 laser can do ;006
But I will most likely be building a solid-state laser, because gas lasers are too complex to build as a side hobby...
Entirely doable. You can use three different kinds of pumping lamps to pump the laser crystal: an LED, a laser diode, or a xenon flash lamp.
Oh, and about a 100 watt CO2 laser: you could really flash-melt a metal brick, much in the way you would cut a stick of frozen butter with a hot knife. I have a nickel-xenon X-ray gas laser and have flash-melted some bricks of metal.
Although I had much success in building an X-ray metal vapor laser, there are two things you should pay just as much attention to with a CO2 laser:
Added after 15 minutes:
Power consumption and cooling. Although I admit that my own homebuilt laser does consume lots of amps, a CO2 laser eats a lot of amps too, in order to hammer the carbon atoms into emitting infrared, on the order of 10,600 nanometers. On the other hand, my laser actually consumed 4 to 12 amps at 14 kilovolts, just enough to jostle the nickel atoms hard enough to emit 2.2 nanometer X-ray laser light (over 100 watts if I bump the voltage a bit, but that would only serve to overheat both the load transformer and the transistors.)
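Taking those figures at face value, the wall-plug efficiency works out to well under one percent - which is typical for metal vapor lasers, for what it's worth:

```python
# Electrical input vs. quoted optical output for the figures above.
volts = 14_000.0
amps_low, amps_high = 4.0, 12.0
optical_w = 100.0                  # the quoted laser output

p_in_low = volts * amps_low        # 56 kW in at the low end
p_in_high = volts * amps_high      # 168 kW in at the high end
eff_pct = 100 * optical_w / p_in_low

print(p_in_low, p_in_high)
print(round(eff_pct, 2))           # ~0.18 % efficient, at best
```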
Well, I've found a nice solid-state laser setup I can build. The only question now is where the heck do I find a ruby or (better) neodymium YAG rod??
Try eBay - www.ebay.com - it's your best bet, and yes, they ship overseas (depending on the seller - most do, some don't. Be careful of fake items; review the seller's history first.)
Kewl. Now, the length of the rod isn't that important; it's the power the xenon flashlamp produces, right?
And of course the quality of the mirrors... ^_^
Yep. You got it all right. One trick to get a freakishly bright flash, for efficient usage of the voltage stored in the high-voltage capacitor, is to use a horseshoe flashlamp. You can do it with a disposable camera's guts, but BE CAREFUL - don't touch the pins of the capacitor and the lamp; remove the battery, then short the capacitor (if you must, wear ear protection). Then desolder the original lamp and use the xenon horseshoe lamp instead. (The middle wire goes to a tiny ignition transformer, smaller than the one charging the capacitor.)
Come to think of it, I can also experiment with different ways of bombarding the rod with photons...
I could line the entire cylinder housing the rod with ultra-bright LEDs shining directly on the rod... Sure, I will lose lots of photons this way, but some 50-100 LEDs produce one heck of a lot of light nowadays... Osram has even made one that can supply 200 lumens per LED (http://www.physorg.com/news4538.html) ;006, but those will probably be insanely expensive. I did, however, manage to find high-output LEDs for a reasonable price, only they are red... Which makes me wonder: if I do go for LED bombardment, does the color of the light make any difference?
Or, I could buy me a 1500W halogen light rod... :D which would make construction a lot easier...
Hmm, or to go completely overboard: get two 1500W halogen bars, for a rig totalling 3KW, cuz our fuse boxes here don't go any higher than (220V x 16A) 3520W, and it would be fun to be able to sustain a beam long enough to see how effective/destructive it is...
And in case someone doubts the power of halogen (or just for the lulz)...
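The fuse-box arithmetic checks out, for what it's worth:

```python
# Dutch household circuit: 220 V behind a 16 A breaker.
circuit_w = 220 * 16      # watts available on the circuit
rig_w = 2 * 1500          # two 1500 W halogen bars

headroom = circuit_w - rig_w
print(circuit_w)          # 3520
print(headroom)           # 520 W left over for everything else
```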
Well, you can light a laser rod with a halogen bulb, but I don't recommend it - because, for a longer life, the rod likes to stay in a cool area (due to the radioactive nature of one particular mineral, the lasant material neodymium. Too hot, and it won't lase; it could even blow up.)
And to take the headache out of choosing the LEDs, I'll help you pick out the right pump-lamp LEDs for each laser rod's particular ingredient, ensuring your success:
Nd:YAG / Nd:YVO - 808nm IR LED
Ruby - 590/575nm Yellow LED
Darn... cuz sticking one or two halogen bars in a tube is a lot easier than customizing a reflective tube to fit lots of LEDs...
Oh well... better I find out now than have my laser blow up and kill me... ^_^
Oh well, I just like you, and the others who read this forum, to be safe. ;001
I have experienced cracks in a Nd:YAG rod due to overheating (from being too near the halogen lamp rod). ;020
I've had that problem once.
What?! Did the same thing happen to you..! I'm just surprised, because an exploding Nd:YAG rod is so lethal that its shrapnel is surgically sharp and will stay hot for up to 20 minutes due to the radioactive rare-earth metal neodymium (the same metal used in rare-earth magnets). If your situation wasn't the same as mine, tell me, Vegapunk - I'm curious still.
So, Smokey - have you tried LEDs for laser rod pumping?
I'm more curious about your laser, whether it was successful or not.
Shit, I should try and scour the catalog for my NES project... My hard drive got shot, so it will be a while. T^T
Heh, not yet... I was "smart" enough to visit these guys (http://buildalaser.net/index.htm), who told me that most rods found on eBay are already depleted and thus make for poor laser equipment... But they sell new ones, only a bit too expensive for me to work with at this very moment... ;063
Plus I am unable to determine if they even ship to Europe...
That's definitely what I should watch. I've heard nothing of depleted Nd:YAG rods. Here's why: Neodymium has an outrageously long decay time, with a half-life of up to 700+ years. And it's the active ingredient of the rod's formula.
I would just safely say it's "depleted" because its former owner was simply a baka, or didn't give the cooling system enough attention.
Like I suggested, review the eBay seller's history.
(I'm not being mean, but you sometimes have to separate the facts from the bullsh*t.)
QuoteThat's definitely what I should watch. I've heard nothing of depleted Nd:YAG rods. Here's why: Neodymium has an outrageously long decay time, with a half-life of up to 700+ years. And it's the active ingredient of the rod's formula.
There is an error in your calculations... -_-
Quote from: "Wikipedia"Naturally occurring neodymium (Nd) is composed of 5 stable isotopes, 142Nd, 143Nd, 145Nd, 146Nd and 148Nd, with 142Nd being the most abundant (27.2% natural abundance), and 2 radioisotopes, 144Nd and 150Nd. In all, 31 radioisotopes of Neodymium have been characterized up to now, with the most stable being naturally occurring isotopes 144Nd (alpha decay, a half-life (TÂ½) of 2.29Ã--10^15 years) and 150Nd (double beta decay, TÂ½ of 7Ã--10^18 years). All of the remaining radioactive isotopes have half-lives that are less than 11 days, and the majority of these have half-lives that are less than 70 seconds.
So the somewhat stable radioisotopes of Nd have half-lives of, what, like a couple thousand and 7 million TRILLION years...
That said, it's sure quite probable that someone's just screwed up and killed them off by clumsiness.
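Spelled out in plain numbers, from the Wikipedia figures quoted above:

```python
# Half-lives of the two long-lived natural Nd radioisotopes, in trillions of years.
TRILLION = 1e12

nd144 = 2.29e15   # years (alpha decay)
nd150 = 7e18      # years (double beta decay)

print(round(nd144 / TRILLION))   # ~2290     -> thousands of trillions of years
print(round(nd150 / TRILLION))   # ~7000000  -> millions of trillions of years
```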
Well, you made your point, NejinOniwa. I didn't know about it until I found out its actual half-life value.
I knew it had to be one of the Nd isotopes. Not sure if it was Nd-144 in the formula... It definitely has to last much longer than a few weeks, otherwise it's useless as a lasant.
Depleted Nd:YAG (by normal usage)? Very unlikely. I've never heard of that.
Although I know the overheating issue is very effective at killing a laser rod, like that one.
Ah, well... I am a bit spoiled by the Dutch laws stating that such false statements are misleading and thus illegal...
Buuuuut that would make my search a lot cheaper... (and a little bit less hard ^_^)
I also found Nd:YAG slabs (awfully expensive, though). What's the added value of those? Cuz I think those mean spending more cash and more work to get a couple of rods, if you don't break 'em while cutting the slab...
That stuff (if perfectly polished at the ends and used as a laser itself) would make a hole from far away! If, supposedly, the slab is one foot long and has a nice pink tint, it would make 50 MW to 2 GW of pulsed light, depending on how chilled it is and what kind of lamp is lighting it up like a Christmas tree.
Crap, i can't find the right ebay... What ebay did you send me to?
What error did you encounter when you tried www.ebay.com site?
Not an error, but more the fact that I can't find the items I previously found...
Added after 2 minutes:
Found it... ^_^
Eehm... Are your cookies enabled in Internet Explorer (or Firefox)? I don't know...! T_T
Anyways, saving an HTML copy of a favorite item helps too. (Remember to rename the saved page to something short; otherwise IE will fail to save it.) I always do that, though.
Ah, well, I am currently on my laptop; I keep that page open in Chrome on my PC... Also found a slab which is slightly larger (~1cm)... :D
Added after 2 minutes:
And WAUW!!! Ruby laser rods are a lot cheaper!!! ;010
Ah. Goody! (sounding like an Italian chef) Mamma-Mia, that-a gonn-a be good!
Ah, yes... i do believe that size matters... ^_^
Yea. You will want to wear a welder's faceshield. Any higher (the bigger the Nd:YAG crystal, the higher the optical wattage), with the laser pointed at a light metal, and you will want to wear a lead apron too.
In that case (if the lead apron is to stop the radiation) I think I'll go for a nice NBC suit paired with welding goggles... always wanted one of those, and now I finally have an excuse to scare the living crap out of the neighborhood... :D
Which, apparently, now you do.
A lighter metal, like aluminum (the kind used for food foil), would basically emit powerful X-ray radiation when the infrared laser disintegrates it, with an optical brightness a few times brighter than the tubes used by CT scanners.
Firebrick or a darker-colored high-temp ceramic is the best choice for a laser-stopper target.
Just to let you know.
Thank god I haven't finished yet, then... I was planning on stopping the beam with a brick of lead...
Yes, you can use that, but if you insist, try using a lead-tungstate alloy brick (I have one, primarily for stopping X-ray laser radiation, on the order of a few watts to 5kW.)
Firebrick always has a small amount of lead, naturally occurring in a few of the rocks in its ingredients.
Well, I don't have to worry about stopping any laser rays for a while... I blew the power supply and the fuse in the breaker box, so the project is on hold for a while... :(
How did you blow it up?
Well, I have a lot of paperclips lying around and the power supply didn't exactly have a cover. Combine that with a cat who jumps on my PC, thereby tossing some paperclips down, and ZAP, crackle: gone power supply, and gone power to half of the first floor...
Well, what kind of lamp is providing the rod with its pumping emission?
And I'm curious also: what kind of laser rod are you using?
Well, I have me a nice ruby rod now (~10cm) and I was setting up a host of LEDs to light the sucker up ^_^, but I can't say if it will work or not, since I can't supply it with power now. Plus, I have to convince my sister all over again that it's safe and that this was a fluke...
You're lucky, though - you can still run it off batteries.
Did you put in a fan to cool the LEDs and the ruby rod?
Wait, fan? It needs forced-air cooling?? Good thing you mentioned it, and that I haven't yet bombarded the rod to bits; that would be a shame...
Wow, normally I am one to cool even things that needn't be cooled... and now that I am working with something expensive, I forget it... :/
Yeah, I'm just being safe. The high-power LEDs get very warm too, when they're trying to spew out a lot of light at the laser rod. And the fan also provides safeguard cooling, just in case the laser rod gets hot in operation.
Also, I would like to see a picture of your setup. (Be careful of the camera flash, though. If you have to use it, keep the beam-stopper brick in place, in case the laser fires from the camera flash.)
Or I can use my big flash; then I can move the flash head to the left or right, so it doesn't directly illuminate the rod...
But I will get a pic up here when I get a chance... (I'll have to clear the desk and set it up again... ^_^)
It would be nice. Just take your time.
Meanwhile, a thought crossed my mind when I talked to my sis tonight about supersolids and superfluids: lasers can also be used to cool something to near 0K (http://en.wikipedia.org/wiki/0_Kelvin)...
All you need is a bunch of finely calibrated (powerful) lasers...
Sorry, I didn't have any time to stop and smell the roses. The laser-cooling method is interesting. Did you get it going yet?
No, I haven't even had time to set the laser up to take some pics, let alone scrounge up a new power supply...
If you have an old computer power supply that hasn't been in use for a while and still works, you can probably use it. If you do, here's a list of the wire colors and their functions, just in case you don't know them. Should be a good refresher course anyway:
Green: On (short it to ground to turn it on.)
Black: Ground (In this case it would be "-" Cathode.)
Yellow: +12 Volts DC
Red: +5 Volts DC
Orange: +3.3 Volts DC
For me those are new PSUs; "old" PSUs are AT PSUs, which have a built-in power switch... ^_^ But that is a good idea, although computer PSUs aren't as stable as lab PSUs...
Apparently, no power supply is perfect - except, of course, those having rather fat capacitors for DC power filtering. However, I have had no problem running tons of stuff off a harvested DELL computer power supply, though it did kill an infrared VCSEL diode. LEDs, on the other hand, are very forgiving when it comes to ESD (except for the Gallium Nitride LEDs, green [510 - 490nm] to near-X-ray [180 - 48nm]; they're very sensitive to ESD). Yellow LEDs, paired with resistors, will be fine on this PSU.
Ah, then it should be fine... or I can always toss in an extra circuit with a couple of filtering capacitors...
Meanwhile I had a brainwave: if a laser rod lases more intensely with more photons being pumped in, how would a rod being pumped by a dozen laser pens behave, since laser pens already produce a coherent beam of amplified light... ;006
Well, it won't work, because the ruby rod absorbs yellow light better than red light.
BUT, if you put in two ruby rods, both illuminated by a yellow LED array, then one ruby rod could properly be labeled the pump laser "lamp", and the other a laser emission intensity amplifier.
Both rods behind each other, or one aiming at the other?
Also, while watching "Build It Bigger: Rebuilding Greensburg" on DiscoChan, I thought of using a parabolic trough to pump a laser rod... We have a nice hot sun this time of year, and I was concepting a sort of small-scale hydro-thermodynamic generator...
[attachment deleted by admin]
Place the two laser rods with their ends facing each other, so that the first rod pumps the other rod.
BTW, a parabolic mirror will work. The only problem is: how do you cool the steam? Under the right conditions it can hit a few hundred degrees, unless you have a heat exchanger of some sort. Pressure is also another concern.
And, no - it's unsafe to pump the laser rod unless there is freezing cold air blowing across it. Heat will simply stress the laser rod until it either explodes or cracks.
Added after 13 minutes:
But still, you have another shot at it. Use the best leaded glass and a yellow borosilicate glass filter, cooled by rear-projection TV CRT coolant, to reject the heat-generating infrared radiation, leaving you only with blindingly bright yellow light to pump the ruby rod without frying it in the process. Be careful, though: not all ruby rods are approved for CW (continuous-wave) operation - some are pulsed-mode only.
Yeah, I thought of that; the generator setup would need a heat exchanger similar to an ordinary PC watercooling system, only bigger: pump cool water through a tube housing the condenser, and cool that with a radiator... Or use a two-stage cooling system, in which the primary loop cools the generator loop and the secondary loop cools the primary one, resulting in the radiator(s) having to cool less-hot water...
And I tend to forget that laser rods have the nasty tendency to blow when they get too hot, which makes building an extremely powerful one a bit difficult, since liquid nitrogen cooling systems are a bit hard to come by for average people like me... ^_^
Anyways, if I were to get that generator system working and tweaked nicely enough to produce some decent wattage, I could cut back on the energy bill a bit... This works a bit better than a (home-brewed) windmill, because catching wind with a mill small enough to avoid building codes and pissed-off neighbors is nearly impossible, and a sun boiler could produce enough pressure (if the system is built right) to drive multiple (or one whopping big) turbines... ^_^
That's alright, people forget. At least I reminded you of the laser rod's nasty nature.
And, back to the computer topic: I bought a Western Digital 500GB Caviar Blue (formerly Caviar SE16) to replace a messed-up secondary drive. Now I've got enough room for editing Blu-ray Disc files (M2TS) - I like to do video authoring. And if anyone would like to recommend the best HDD recovery software (freeware is appreciated too), I could use it to recover my files off the messed-up HDD. Tried PC Inspector; it took too long.
RIPLinux (http://en.wikipedia.org/wiki/Recovery_Is_Possible) maybe... I also read at the bottom of this page (http://geeksaresexy.blogspot.com/2005/12/hard-drive-recovery-utilities-when-you.html) that there is a guide to HDD recovery with Linux here (http://www.shockfamily.net/cedric/knoppix/), which is supposed to be very complete and informative... (Looks like it's written for n00bs (http://en.wikipedia.org/wiki/FNG_syndrome) and the PC-ignorant (http://en.wikipedia.org/wiki/N00b)... ^_^
Well, what I'm having is a hard drive full of broken FAT32 clusters. And I was wondering about Linux, because SANEFS (EXT3) is their standard, and it's likely to complain about the broken clusters. And I don't have any room for the HDD anymore, so I'm using a USB microcontroller to get it read by the computer. (That, and I'm worried about Linux reading the maligned hard drive via USB.) Will try it anyways.
Oh, a lot of Linuxes nowadays are pretty easygoing when it comes to reading FAT16/32 or NTFS...
Almost there! But the TOC is already screwed up. The files are left intact, although orphaned. I need a way to bypass the TOC, since the boot sector is messed up.
At least I have some rewritable DVDs that I've thrown in. Don't have a BD-RE burner yet; that coulda saved the HDD's shiny, metallic ass.
Oh, if it's just the FAT table that's gone missing, you shouldn't have too much trouble recovering your data... Now, I don't know which programs they are, and I don't know if RIPLinux does it too, but there are tools which brush over the fact that the FAT is missing and read what's physically on the drive, allowing you to copy it all...
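The idea behind those tools can be sketched in a few lines: ignore the (broken) file table entirely and scan the raw bytes for file signatures. A toy illustration in Python - real carvers handle fragmentation, false positives, and much more, and the JPEG markers here are just the best-known example:

```python
# Minimal file-carving sketch: find byte ranges that look like JPEG files
# by their start-of-image / end-of-image markers, ignoring the filesystem.
JPEG_SOI = b"\xff\xd8\xff"   # start-of-image marker
JPEG_EOI = b"\xff\xd9"       # end-of-image marker

def carve_jpegs(raw: bytes):
    """Return (start, end) byte ranges that look like JPEG files."""
    found, pos = [], 0
    while True:
        start = raw.find(JPEG_SOI, pos)
        if start == -1:
            break
        end = raw.find(JPEG_EOI, start)
        if end == -1:
            break
        found.append((start, end + len(JPEG_EOI)))
        pos = end + len(JPEG_EOI)
    return found

# Fake "disk image": filler, one embedded JPEG-looking blob, more filler.
image = b"\x00" * 512 + JPEG_SOI + b"fake jpeg body" + JPEG_EOI + b"\x00" * 512
print(carve_jpegs(image))   # one recovered range starting at offset 512
```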
DAMN YOU WORLD
DAMN YOU EVERYTHING
MY DELTA IS BROKEN BEYOND MY SKILLS OF REPARATION
Time to buy myself a HDD cabinet and an UMPC.
NejinOniwa, how did you break it? I'm wondering, since it's possible that I may be able to help you out. (I have messed around with computers long enough to know.)
BTW, Smokey, I shouldn't worry too much about trying to recover anything; I will have to worry about that later.
Yeah, it might actually help to say what the problem is... ^_^
Strange BIOS bleeps, no video etc...
Yep. There are 3 (4 if using a newer video card) most common problems: a caked-up CPU heatsink (fried CPU or forced thermal trip), a messed-up OS / boot sector, a power supply failure (the fuse is largely to blame), or a newer video card not getting power from its IDE/PCIe power plug AND/OR a caked-up GPU heatsink.
Now that's it, or so I think.
A faulty RAM module or a destroyed MoBo can also be causes... where a faulty DIMM can easily be removed/replaced for the system to work again...
BTW, you haven't been flashing your BIOS, have you? Cause that is also a nice way to murder your system...
The cause of the breakage is much more... material than that, I fear. I was messing around with the graphics card earlier, and I think I might have snapped something, very wrongly...
Enough about that, though. I'm just off to return&replace department of the place I bought it tomorrow -w-
Oh, you have warranty on the complete system?
Why didn't you say so before?! RMA the thing and let them sort it out... ^_^
I dunno about the whole system, but the mobo has it, at least (the store I bought some parts from went down in the fall - they're up again now, but I dunno). So yeah, kinda good.
Also RMA the vidcard, or test the MoBo with another vidcard and/or the vidcard on another mobo to see what part is actually broken...
Like I said, the store that sold me the vidcard went down and out for 6 months...I doubt my order history is still there and valid.
That would be a problem trying to get it replaced.
Nowadays, current video processor chips (from '04 to the present) are much nastier than their predecessors. They're packaged like the Pentium III or Athlon K7, namely FCBGA (Flip-Chip Ball Grid Array): the die is visibly exposed. It's easy to kill one if it doesn't have a heatsink shim and you accidentally tilt the heatsink; you hear the classic snap, crackle and pop, and it's done. What's also scary: the north/southbridge chipsets are packaged the same way.
Then it's more likely to be one of the mobo things - my graphics card is downright wrapped in hueg loads of FAN, so don't worry -w-
Well, it doesn't have to be cooked to be broken... you can always check to see if it still works...
No, I mean, it's wrapped up in too much plastic for me to be able to touch anything below it... -w-;
The video processor chip's fine, huh?
I'm thinking you may have accidentally fractured the Northbridge chipset (test the mobo by power cycling: if it doesn't turn off after you hold the power button down for 4 seconds, the chipset or CPU is cooked) OR you may have somehow knocked parts off the video card. (I doubt you bent the video card.)
Okay, inspect the video card and motherboard carefully; if you have spare parts, swapping them in can make that job short.
Like I said, since I've got the vidcard from a now-defunct store, there's little use doing anything other than just return the mobo and hope for the best.
Then again, except for the money I wouldn't mind a new graphics card, since the one I have seems to be a bit swooshy about itself. Maybe an Nvidia card this time?
On that topic, do you prefer ATI or Nvidia vidcards?
I would recommend a Radeon HD, because it's already open-sourced and is relatively easy on the power supply but still has crazy horsepower at the same time; the 55nm Radeon HD 4890 puts out well over a TRILLION operations per second (about 1.36 TFLOPS).
And if you don't care about money, go with an NVIDIA GeForce GTX 200 series card.
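For what it's worth, those headline TFLOPS figures are just arithmetic: stream processors × ops per clock (2 for a multiply-add) × clock speed. A quick sanity check in Python (specs quoted from memory, so treat them as approximate):

```python
def peak_gflops(shader_count, ops_per_clock, clock_mhz):
    """Theoretical single-precision peak: shaders * ops/clock * clock (MHz)."""
    return shader_count * ops_per_clock * clock_mhz / 1000.0

# RV770 (Radeon HD 4870): 800 stream processors, MAD = 2 ops/clock, 750 MHz
print(peak_gflops(800, 2, 750))   # 1200.0 GFLOPS = 1.2 TFLOPS

# RV790 (Radeon HD 4890): same 800 SPs, clocked at 850 MHz
print(peak_gflops(800, 2, 850))   # 1360.0 GFLOPS, i.e. ~1.36 TFLOPS
```

Of course that's the theoretical peak; real workloads never hit it.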
I think it's better to try the motherboard first before buying the fancy pimped-out video cards. They're expensive nowadays.
Well, I was an nVidia fan, but I would go for an ATI card now that they've been bought by AMD...
BTW what vidcard do you have now? GeForce 8***/ 9***?
Radeon 4870 512MB... one of the first releases, from Sapphire. As long as I can keep it cool enough, it's bad ass...
That's good enough.
Also, I'm wondering if anybody knows how to hack Linux into tossing video data through HDMI (DVI-D) to a projector, possibly one with a f***ed-up EDID... The reason I ask is that I tried a few commands to force the TI Pico projector (the development version, the one that goes with the Beagle Board) to display; I heard a few people succeeded in hacking their x86-compatible GPU to display anything from the Pico on x86 Linux.
Pleeeease help me out!
(BTW, I'm using a 64-bit Mandriva Free 2009)
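In case it helps anyone attempting the same: when there's no usable EDID, forcing a mode usually comes down to hand-feeding the driver a modeline (e.g. via `xrandr --newmode` or the kernel's `video=` boot parameter). The one number you can't just guess is the pixel clock, which is simply total pixels per frame times the refresh rate. A quick Python check using the classic 640x480@60 VGA timing (blanking totals quoted from memory):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock a mode needs: total clocks per frame * frames per second."""
    return h_total * v_total * refresh_hz / 1e6

# Classic 640x480@60 VGA timing: 800 total clocks per line (640 visible
# plus blanking), 525 total lines (480 visible plus blanking).
print(pixel_clock_mhz(800, 525, 60))   # 25.2, matching the canonical ~25.175 MHz mode
```

The visible resolution is only part of the modeline; the horizontal/vertical blanking totals are what make the clock come out right.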
Added after 7 minutes:
P.S. If anybody has got their hands on a Beagle Board and has a TI Pico projector running off it, I would appreciate it if you would give me a driver for it.
Also, I'm starting a thread on both the Beagle Board AND, of course, the TI Pico projector too.
Quote: "As long as I can keep it cool enough, it's bad ass..."
True for every component... RAM, CPU, GPU, GRAM, North/Southbridge, HDD...
Although I wonder if hard disks can be overclocked... ^_^
BTW, I don't know how to hack Linux to do such things; I also didn't know that it's even possible to "hack" Linux...
Quote from Smokey: "BTW i don't know how to hack linux to do such things, i also didn't know that it's even possible to 'hack' linux..."
I wouldn't know how to do it either, but "hacking" your own OS is more or less a part of being a "power user" in Linux. Especially so with an OS like Gentoo, where you have to compile everything.
My only suggestion to Dr. Mario would be to search Mandriva's forums, if you haven't already. If people have done it before, surely there is a HowTo online somewhere that could aid you.
Yes, there may be a how-to guide. And NO, there is no easy way out, because you have to fire off special code via the I2C lines already on board the HDMI port to set the TI Pico projector to run off VGA data, in this case 640x480 display data.
Another issue is that the Pico projector has absolutely no EDID serial flash connected to the DVI host interface microcontroller (TRUST ME! I disassembled the projector to put Arctic Silver 5 paste on the LEDs. The only flash was for holding the GPU's firmware, -
Added after 12 minutes:
- although it's user-reprogrammable via the MSP430 programmer port), hence upping the ante in programming this device to act like a monitor despite the absence of EDID data. Its original intention was to be a development kit, but because of its open-source nature it's also a popular add-on for the Beagle Board, which is open-source as well. (You can even build a Beagle Board yourself.)
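One workaround people use for EDID-less displays is feeding the driver a hand-built EDID block instead (via the driver's EDID override mechanism). The format's main gotcha is that every 128-byte block has to sum to 0 mod 256, so the last byte is a checksum. A small Python sketch of just that part (the block here is mostly zero padding, not a complete valid EDID):

```python
def edid_checksum_byte(block127):
    """Given the first 127 bytes of an EDID block, return the final
    checksum byte that makes all 128 bytes sum to 0 mod 256."""
    assert len(block127) == 127
    return (-sum(block127)) % 256

# Start of a minimal EDID: the fixed 8-byte header, then zero padding
# standing in for the vendor/timing fields of a real block.
header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
block = header + bytes(119)            # 127 bytes so far
check = edid_checksum_byte(block)
full_block = block + bytes([check])    # 128-byte block, sums to 0 mod 256
print(check, sum(full_block) % 256)    # 6 0
```

A real override file would need valid descriptors too, but the checksum is what trips people up first.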
Oh well, I will figure it out somehow.
It was and still is awesome to have something that small and portable.
We might see HVD in the future:
Seen that some time ago already, and found this 50 terabyte disc (http://en.wikipedia.org/wiki/Protein-coated_disc) more interesting... ^_^
Yes, it's interesting. (And my company, I think, is playing with 22 petabytes of storage capacity off an electromorphive scintillator crystal cube.) And sorry I haven't been online for two months; the hot weather has been messing the WiFi up... Smokey, got any first light yet?
Heh, yeah I did...
Damn near took my screen out (and my eye)...
Apparently I wasn't cooling it enough, and it gives very little warning before the rod fails...
Too bad, I wanted to leave behind a working laser before I start my basic training (I only have one week left before I join the air force)...
Oh well, I wanted to replace it with an Nd:Glass rod anyway...
;001 That's a mighty laser! I'm curious, what lamp did you pump it with? (A ruby laser, if stirred well, can put out 10 megawatts of pulsed output, roughly comparable to 100 kW CW.)
I'm thinking about building an X-ray laser. I've got parts being collected as I'm either bugging for free samples and/or buying a couple from www.mouser.com. The inverter I'm building for the laser has to spit out 10 Amps at 10,000 Volts AC. I wouldn't worry much, I'm using a PowerPC microcontroller for power management and the safety protocol.
Heh, well I couldn't scrounge up the right LEDs so I "borrowed" the halogen lamps from my ceiling light...
Guess I really do have to beef up the cooling system... and realign the optics, since the output was... well... not what I expected...
That's kind of a bad idea... A ruby laser, as I found out on Sam's Laser FAQ, hates halogen lamps and being operated CW... At least Nd:YAG is a bit more forgiving. (A near-UV [404nm] LED, pulsed to pump the ruby rod, may be the best bet.)
And yes, you can use PC fans, but remember: if you wanna run an Argon-ion laser, you will want a minimum of 250 CFM. Yeah, yeah, I know the fan will be loud, but it's totally worth it.
Well, I don't mind the noise... I just have to build a wind tunnel so the fans aren't loosely blowing over the rod...
That would be good, but any hot spot... Ruby striates too easily, and it will end up with spectacular COD (catastrophic optical damage)... including shrapnel buried in the wall! =x=
My X-ray laser, to be constructed, would have fins all over the tube, like what you've seen on a microwave oven's magnetron tube. And lots of PC fans (for power efficiency)!
BTW, has anyone heard about the event in East Helena, Montana, USA on the 14th of August?
Hmm, cooling fins are a good idea, yes...
And I haven't the slightest clue what happened on the 14th...
Perhaps something to ask in Topicless...
Yes. I'm going to use copper, and possibly beryllium oxide (BeO) for the bore. But kids, DON'T try this at home! BeO is nasty, but I have worked with it before; always cut it while immersed in water to prevent the dust from getting airborne.
BTW, I was only asking because I felt there may be some forum users here who ARE Helenans. I'm a Helenan. I will explain.
Added after 59 minutes:
Here are the pictures of ASARCO in its former glory.
Added after 11 minutes:
ASARCO's smokestacks were the most important landmark in East Helena. Why, you ask? Because ASARCO was the company that established its hometown, East Helena. It was the place its workers called home. And ASARCO was there for over a century.
Then, on the 14th of August, the landmark had to go bye-bye. I witnessed the whole thing, the ka-boom and the thud. Why destroy the smokestacks? Blame it on the EPA.
[attachment deleted by admin]
Yeah, it's a shame to see landmarks get demolished... In the town I was born in, they plan to demolish the iconic bell tower which houses the city's carillon on the city hall square, to make room for a road and a gaudy new city hall... well, that and they've built an apartment complex on the market square...
Yeah, it sucks to lose an important piece of history. What I hate about demolitions is that they're usually funded by crybabies who care little about history. My Japanese wife was pissed off about what has been happening in my hometown (Helena); she values history like she does water. I understand that, but there's no stopping the dynamite-toting bastards anyway.
T_T Rest in peace...
Ya, there was a lot of fuss about the demolition of the old railroad workers' quarters down at the tracks, but I couldn't be bothered to care much, just a little...
But one has to say one thing, though.
ALL HAIL THE LORD OF DESTRUCTION, ALFRED NOBEL.
;020 Apparently true...
If destruction never happened at all, the laws of physics would get so f**ked up that ABSOLUTE IMPOSSIBILITY would be an everyday reality.
A neko exploding then regenerating itself out of nothingness?
Now, that would be f**ked up!
Schrödinger's cat? A cat that is both dead and alive until an observer appears?
Suddenly, Schrödinger, EVERYWHERE!
And the Crouching Flying Fang Tiger Ninja clan!
What if OSX-kun actually exists and is planning to take out M$? That definitely would make reading the news more strange. (Hey, at least Bill doesn't have time to read the news, does he?)
Oh yeah... Intel made their Nehalem family of CPUs smaller... smaller... smaller... smaller... smaller...
They have been releasing 32nm CPUs lately.
Um, AMD's CPUs will still be superior anyway.
They said Nehalem is superior? MEH! They'll eventually find out those benchmarks aren't all that honest (even I don't trust Intel's numbers. Plain LIES.)
Some gamers impressed with the Intel Core i7 won't be so impressed once they find out that RISC architectures actually have the upper hand over CISC ones (Crysis proves that fact once again.)
AMD will start selling the 32nm Phenom II next year. It will be the speed monster that scares Intel away. (Last year, AMD broke the Phenom II's speed record, more than doubling it: it ran at a 6.5GHz clock, ALL FOUR CORES!)
Eh? Guess my thread's as dead as meat... I guess I'll have to remove the thread soon...
However, anyone is still welcome here for the time being.
If it's revived, I will just leave it alive here.
;014 Been a while, and this thread and my other threads are already post-mortem... Ah, what the hell...
Anyways, I'm thinking about rebuilding a laptop to put in a PowerPC G4 (or a G6/G7 if I can find a good one...) and flashing firmware containing basic BIOS callbacks and an x86 interpreter, so I could just enjoy surfing websites like YouTube, only more efficiently and faster (not having to gamble all of the horsepower against battery life, something PowerPC is now good at). It would kinda be like bringing the Transmeta Crusoe CPU back from the dead, only stronger and faster. (If you guys are wondering what I'm going to do with AltiVec, I may as well reassign that SIMD FPU for x86 SIMD: MMX, 3DNow! Professional, SSE1-4, maybe SSE5 if there's a 256-bit AltiVec 3.0 VLIW FPU like the one now present on POWER7 processors.)