# Vega's *Heavyweight* display and computer; edition 2012



## CallsignVega

What complete computer gaming system could _literally_ weigh half of a ton (not including the liquid)?

The build commences...

+

+

+

+

+

= ?

Setups of the past:

http://www.youtube.com/user/CallsignVega/videos



http://hardforum.com/showthread.php?t=1588587

With an almost mathematical certainty, I am building the world's only 3x FW900 portrait, no-bezel-gap, seamless-Fresnel-lens, Ivy Bridge, Quad Kepler gaming computer with three selectable cooling loops (ambient, phase-change, geothermal).

Your run-of-the-mill system.


----------



## CallsignVega

Reserved


----------



## trinh

Holy..


----------



## derickwm

You have my attention.


----------



## malikq86

What monitor is that in the bottom 2 pics? ...the bezel looks crazy thin...


----------



## De-Zant

Quote:


> Originally Posted by *malikq86*
> 
> What monitor is that in the bottom 2 pics? ...the bezel looks crazy thin...


It's the Samsung S23A750D with the bezels removed. No native portrait-mode support though, so be prepared to mod it a bit if you want that setup.

Vega, are you planning on Fresnel lenses for the FW900s? Or?


----------



## csm725

Subbed for epic noobiness.
loljk.


----------



## Lostcase

So I'm on a shuttle to MEPS and decided to check out some overclock.net. Jesus, Vega, this thread just made my day.

Sent from my iPhone using Tapatalk


----------



## CallsignVega

Quote:


> Originally Posted by *De-Zant*
> 
> It's the Samsung S23A750D with the bezels removed. No native portrait-mode support though, so be prepared to mod it a bit if you want that setup.
> 
> Vega, are you planning on Fresnel lenses for the FW900s? Or?


Testing various technologies as we speak.
Quote:


> Originally Posted by *Lostcase*
> 
> So I'm on a shuttle to MEPS and decided to check out some overclock.net. Jesus, Vega, this thread just made my day.
> Sent from my iPhone using Tapatalk


That galvanized steel drill bit in picture #4 was harder to make than you would think! I just hope it doesn't create too much collateral damage on the FW900s.


----------



## DevilDriver

Subbed!


----------



## Alan1187

Subbed here and on your YouTube. That way I know I won't miss anything, lol.


----------



## PvtHudson

Hey Vega,

I saw your discussion and screenshots over on [H]ardOCP, and when you first got the FW900 you seemed a bit disappointed. You compared the image quality to your Samsung monitors and the Samsung "won" in almost every category, but a few posts later you began saying how the FW900 blows all LCDs out of the water. I'm curious as to what made your opinion change. Was there something you tweaked, or was the calibration off?


----------



## CallsignVega

Quote:


> Originally Posted by *PvtHudson*
> 
> Hey Vega,
> I saw your discussion and screenshots over on [H]ardOCP and originally when you got the FW900 you seemed a bit disappointed. You compared the image quality to your Samsung monitors and the Samsung "won" in almost every category but a few posts later you began saying how the FW900 blows all LCDs out of the water. I'm curious as to what made your opinion change. Was there something you tweaked or was the calibration off?


I didn't have the FW900 calibrated, nor did I have a monitor hood (CRTs don't like ambient light). Later in that thread I posted the FW900 versus the Apple 27" Cinema Display (which I think has the best LCD picture out there), and the FW900 won in static image tests. For motion, the Apple wasn't even on the same planet as the FW900. Take a look at that post.


----------



## golfergolfer

I don't really see how this could be much better, but SUBBED!!! Can't wait to see what you do.


----------



## CallsignVega

Quote:


> Originally Posted by *golfergolfer*
> 
> I don't really see how this could be much better, but SUBBED!!! Can't wait to see what you do.


You must never have seen the FW900 in action! Now imagine 4320x2304 resolution (10 million pixels) over a 52" image, all with CRT smoothness and zero motion blur, in a seamless image (no bezels).
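A quick back-of-the-envelope check of those figures. The per-display mode is my assumption, not stated in the post: the FW900 is commonly run at 2304x1440, which becomes 1440x2304 when rotated to portrait.

```python
# Sanity check of the quoted numbers, assuming each FW900 runs a
# 2304x1440 mode rotated to 1440x2304 in portrait (an assumption).
per_display_w, per_display_h = 1440, 2304  # portrait width x height

total_w = per_display_w * 3  # three displays side by side
total_h = per_display_h

pixels = total_w * total_h
print(f"{total_w}x{total_h} = {pixels:,} pixels")  # 4320x2304 = 9,953,280
```

Which indeed lands at the quoted 4320x2304, just under 10 million pixels.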


----------



## csm725

I am excite.


----------



## armartins

Sub'd! WARNING: Get ready for the environmentally friendly folks bashing your CRT setup over the power bill and even noxious CRT particle emissions...


----------



## CallsignVega

Quote:


> Originally Posted by *armartins*
> 
> Sub'd! WARNING: Get ready for the environmentally friendly folks bashing your CRT setup over the power bill and even noxious CRT particle emissions...


But to make up for it, one of my three selectable cooling loops will be geothermal. Very green! Oops, I gave away picture #4.


----------



## CallsignVega

Here is my proposed simple cooling system. Let me know if you guys can see any way to make it more efficient.

Here I have started the geothermal drilling. This is the top 3 feet (17 feet underground). I've drilled near the home's slab foundation and in a year-round shaded area to maximize earth-temperature stability. So far it looks like the ground-water level is 5-6 feet below the surface, which is GREAT. Finally, a positive about living in the South.
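Whichever of the three loops is selected, the sizing math is the same: heat removed equals mass flow times water's specific heat times the coolant temperature rise. A minimal sketch — the 1000 W load and 5 °C rise are made-up illustrative numbers, not Vega's:

```python
# Heat carried by a water loop: Q = mass_flow * c_water * delta_T.
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def required_flow_lpm(heat_load_w: float, delta_t_c: float) -> float:
    """Litres/minute of water needed to move heat_load_w at a delta_t_c rise."""
    kg_per_s = heat_load_w / (SPECIFIC_HEAT_WATER * delta_t_c)
    return kg_per_s * 60.0  # ~1 kg of water per litre

# Hypothetical 1000 W combined CPU+GPU load at a 5 C coolant rise:
print(f"{required_flow_lpm(1000, 5):.2f} L/min")  # 2.87 L/min
```

Under those assumptions even a modest ~3 L/min of flow moves a kilowatt; the radiator (or the ground, for the geothermal loop) then has to shed that heat at the same rate.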


----------



## CallsignVega

Whoops, I just realized I placed the RAM slots like X79 and not Z77. Actually, Z77 will make setting up the cooling loops even easier.


----------



## JedixJarf

Quote:


> Originally Posted by *CallsignVega*
> 
> Whoops, I just realized I placed the RAM slots like X79 and not Z77. Actually, Z77 will make setting up the cooling loops even easier.


I was actually just about to mention that


----------



## PARTON

sub'd


----------



## dtfgator

Wow, subbed for sure. May I ask where you are getting the money for all of this? Sure to be one hell of a kick ass build.


----------



## PARTON

Quote:


> Originally Posted by *dtfgator*
> 
> Wow, subbed for sure. May I ask where you are getting the money for all of this? Sure to be one hell of a kick ass build.


I'm guessing the government pays him for his time.


----------



## TheBadBull

wow. enthusiast + lots of cash = EPIC
about right?


----------



## Blostorm

BRB changing pants.


----------



## legoman786

In!


----------



## CallsignVega

For those interested in the Fresnel display setup: whenever you add a lens, some degradation will occur, though during my proof-of-concept testing I've found the trade-offs well worth it. You get a slight decrease in image clarity, in the form of slight chromatic aberrations and, in bright scenes, some visibility of the circular grooves cut into the lens. But you gain magnification and the ability to eliminate bezels. It has actually done quite well in my testing and allowed me to go into full project mode.

Take a look at this video:

http://www.youtube.com/watch?feature=player_detailpage&v=TrhGiVl1RNs#t=337s

Now, if he were to position the camera exactly at his eye position, there would be virtually zero gap between the images, as you can see in this video:

http://www.youtube.com/watch?v=bnNM-N_1ve8

If done properly, the gap between images will be 1mm or less.


----------



## Manyak

I've got my doubts about your Fresnel lens idea when it comes to image quality....but I guess we'll have to see


----------



## Bit_reaper

Wow, just wow.

The Fresnel lens idea is very neat.


----------



## phillyd

this...
is beyond ridiculous.
im jelly


----------



## CallsignVega

Quote:


> Originally Posted by *Manyak*
> 
> I've got my doubts about your Fresnel lens idea when it comes to image quality....but I guess we'll have to see


I had MAJOR doubts about the Fresnel when I was first looking into it. I've been running a high-quality lens on my FW900 as a proof of concept for over a week now. When I took off the Fresnel, I wanted to put it back on! It made the 22.5" viewable area appear to be around 28". It was much easier to see enemies in fights, etc. Plus, when I add three of them for a huge image and sit further back, the chromatic aberration and the circular lens cuts are even harder to see/notice. The feeling of "depth" from the lens is also interesting; it is amazing how fast your brain adapts to something like a lens in front of your screen, to the point that you barely notice it anymore.

So you get slight image degradation in exchange for a larger image and near-seamless bezels. It will be interesting indeed, but a crapload of work, that's for sure.
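The magnification can be sanity-checked from the figures in the post itself (22.5" viewable appearing as roughly 28"; illustrative arithmetic only):

```python
# Apparent magnification implied by the numbers in the post.
diagonal_native = 22.5    # inches, FW900 viewable diagonal
diagonal_apparent = 28.0  # inches, approximate size through the Fresnel

magnification = diagonal_apparent / diagonal_native  # linear scale factor
area_gain = magnification ** 2                       # apparent screen-area gain
print(f"~{magnification:.2f}x linear, ~{area_gain:.2f}x apparent area")
```

Roughly a 1.24x linear gain, which compounds to about 1.55x the apparent screen area per monitor.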


----------



## Manyak

Quote:


> Originally Posted by *CallsignVega*
> 
> I had MAJOR doubts about the Fresnel when I was first looking into it. I've been running a high-quality lens on my FW900 as a proof of concept for over a week now. When I took off the Fresnel, I wanted to put it back on! It made the 22.5" viewable area appear to be around 28". It was much easier to see enemies in fights, etc. Plus, when I add three of them for a huge image and sit further back, the chromatic aberration and the circular lens cuts are even harder to see/notice. The feeling of "depth" from the lens is also interesting; it is amazing how fast your brain adapts to something like a lens in front of your screen, to the point that you barely notice it anymore.


This sounds like something I need to try and see for myself.

But hey, have you tried pushing the FW900 to 2560x1600 with that thing on yet? Yeah, the aperture grille can't fully resolve that resolution, but it might still make gaming absolutely awesome.


----------



## CallsignVega

Quote:


> Originally Posted by *Manyak*
> 
> This sounds like something I need to try and see for myself.
> 
> But hey, have you tried pushing the FW900 to 2560x1600 with that thing on yet? Yeah, the aperture grille can't fully resolve that resolution, but it might still make gaming absolutely awesome.


The resolution would look good, but the maximum refresh rate would be too low to use (the 1600-line vertical resolution being the limiting factor).
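The limiting factor is the tube's maximum horizontal scan rate: every extra vertical line costs one more horizontal sweep per frame. A rough sketch — the 121 kHz limit is the commonly quoted FW900 spec, and the 5% vertical-blanking overhead is an assumption:

```python
# CRT refresh ceiling: the gun can draw at most MAX_HSCAN_HZ lines per second,
# so refresh ~= horizontal_scan_rate / (vertical_lines * blanking_overhead).
MAX_HSCAN_HZ = 121_000  # commonly quoted FW900 maximum horizontal scan rate
VBLANK_OVERHEAD = 1.05  # assumed ~5% extra scanlines for vertical blanking

def max_refresh_hz(vertical_lines: int) -> float:
    return MAX_HSCAN_HZ / (vertical_lines * VBLANK_OVERHEAD)

for v in (1200, 1440, 1600):
    print(f"{v} lines -> ~{max_refresh_hz(v):.0f} Hz")
```

Under these assumptions 1600 vertical lines tops out in the low 70s Hz, even before any pixel-clock limit on the card side, versus ~96 Hz at 1200 lines.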


----------



## PoopaScoopa

Did you get rid of the Tahitis because of the 5 display Eyefinity bugs?


----------



## Manyak

Quote:


> Originally Posted by *CallsignVega*
> 
> The resolution would look good but the maximum refresh rate would be too low to use (vertical 1600 being the limiting factor).


Max refresh rate would be 60Hz. That monitor _can_ pull it off.


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> Did you get rid of the Tahitis because of the 5 display Eyefinity bugs?


Ya, 5-display Eyefinity did not work out too well; only a few games out of my whole library worked right. Not to mention I feel Kepler will be faster, nVidia has traditionally had better/smoother multi-screen and multi-GPU support, and 3x FW900 Surround will be epic. It also can only be done on nVidia, as each monitor plugs into a different GPU and has its own RAMDAC. A much better setup, IMO, versus AMD, where you have to plug all displays into a single card no matter how many cards you run.
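One way to see why each FW900 benefits from its own RAMDAC: a single high-refresh analog mode already eats most of a DAC's bandwidth. A rough estimate — the ~35% blanking overhead and the 400 MHz DAC ceiling are assumptions typical of cards from that era, not figures from the post:

```python
# Approximate pixel clock for an analog mode, including blanking overhead.
BLANKING = 1.35  # assumed ~35% combined horizontal+vertical blanking

def pixel_clock_mhz(width: int, height: int, refresh_hz: float) -> float:
    return width * height * refresh_hz * BLANKING / 1e6

# Hypothetical 2304x1440 @ 80 Hz mode against an assumed 400 MHz RAMDAC ceiling:
clk = pixel_clock_mhz(2304, 1440, 80)
print(f"~{clk:.0f} MHz of a 400 MHz DAC")
```

Roughly 358 MHz for one monitor, which is why splitting three such displays across three DACs (one per GPU) is the comfortable arrangement.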








Quote:


> Originally Posted by *Manyak*
> 
> Max refresh rate would be 60Hz. That monitor _can_ pull it off.


Ya, 60Hz would make my eyes bleed.


----------



## PoopaScoopa

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, 5-display Eyefinity did not work out too well; only a few games out of my whole library worked right. Not to mention I feel Kepler will be faster, nVidia has traditionally had better/smoother multi-screen and multi-GPU support, and 3x FW900 Surround will be epic. It also can only be done on nVidia, as each monitor plugs into a different GPU and has its own RAMDAC. A much better setup, IMO, versus AMD, where you have to plug all displays into a single card no matter how many cards you run.


Yeah, with your previous triple 30" setup you showed that 4-way CrossFireX, with everything running off the first card's bandwidth, couldn't handle the load. Did you ever try it with just the two 6990s? I thought you still had to connect monitors between GPU 1 and 2 for Nvidia; I didn't know you could connect the 3rd to GPU 3. Kinda sad how AMD's 5-screen setup doesn't work when that's their flagship marketing.


----------



## Paradigm84

I understand practically none of what these posts are on about, but it's impressive as hell regardless. Subbed x10000000.


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> Yeah, from your previous triple 30", you showed that 4-way CrossfireX serial bandwidth running off the first card couldn't handle the load. Did you ever try it with just the two 6990s? I thought you still had to connect monitors between GPU 1 and 2 for Nvidia, I didn't know you could connect the 3rd on GPU 3. Kinda sad how AMD 5 screen doesn't work when that's their flagship marketing.


2x 6990s worked fine for a single screen but failed pretty badly at Eyefinity.

Yes, with nVidia you can connect monitor #1 to card #1, monitor #2 to card #2, and down the line... 3 to 3, or even 3 to 4.

Got a few components in:

XSPC Raystorm full-copper CPU block; it looks and feels like an awesome piece. Watercool MO-RA3 Pro radiator with 9x 140mm XSPC fans for my ambient loop section; this thing is built like a tank, with traditional German quality.

I've gone with EVGA Ultra Classified 580s until Kepler releases. Who knows, I might even be able to use EVGA's 90-day Step-Up program on Kepler. I've always liked EVGA in the nVidia camp. Also tossed in there: a Bitspower reservoir for my silver kill coils, fluid-level viewing, and a fluid access port. I plan to run distilled water with silver kill coils, IandH Dead-Water copper sulfate, and Water Wetter.


----------



## csm725

Nice. Looking forward to Kepler.


----------



## PoopaScoopa

When you had just 3 monitors, was it still bad? I thought it was just 5 monitors that had trouble.

Also, EK has stated that the copper sulfate additive (PT Nuke) will cause corrosion, and they won't honor RMAs if you used it. You already have a silver kill coil to handle biocide. Are you sure it's safe? Especially with the aluminum in your radiator and the bare-copper CPU block too. Water Wetter will add gunk and staining, if you're OK with that.


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> When you had just 3 monitors, was it still bad? I thought it was just 5 monitors that had trouble.
> Also, EK has stated that copper sulfate additive(pt nuke) will cause corrosion and they won't honor RMAs if you used it. You already have silver kill coil to handle the growth. Are you sure it's safe? Especially with the aluminum in your radiator too. Water wetter will add gunk and staining if you're ok with that.


Ya, just 3 was a no-go.

I will look into the copper sulfate, I haven't bought it yet. Maybe I will just run the silver kill coil. I've had great success with Water Wetter before.


----------



## PoopaScoopa

You could try Feser Base corrosion inhibitor and Nuke PHN (no copper sulfate) biocide, which works great with distilled water. You might want to ask someone who's run aluminum + bare copper whether the corrosion inhibitor will be enough, as those two metals shouldn't be in the same loop.

May I ask what was wrong with the 3 monitors? I plan on getting a triple setup in the summer. They looked good in your YouTube videos.


----------



## Citra

You would...

Sub'd.


----------



## travva

EPIC. Definitely sub'd. Good luck dude, I can't wait to see the end result.


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> You could try Feser base corrosion inhibitor and Nuke PHN(no cop sulf) biocide which works great for distilled water. Might want to ask someone who's used alum+bare cop if the corrosion inhibitor will be enough as those two metals shouldn't be in the same loop.
> 
> May I ask what was wrong with the 3 monitors? I plan on getting a triple setup in the summer. They looked good on your yt videos.


Are you referring to the 2x 6990 or the 2x 7970? The 2x 7970 were much better with 3 monitors. The FW900 is just far better than any LCD, hence why I am doing this setup and going with nVidia. Ultimate smoothness and clarity for the win.

On another note, I just finished my custom reinforced 7-foot stainless-steel desk to hold my monster display setup. I stood right in the middle of it on one foot (180 lb) and it didn't even flex a hair.

Welding stainless steel is harder than you think! (Just kidding.) Time for the old glass desk to go!


----------



## dtfgator

Can we see a picture of the monitors and lens?


----------



## CallsignVega

Quote:


> Originally Posted by *dtfgator*
> 
> Can we see a picture of the monitors and lens?


I don't have them all set up yet. In the desk picture above you can see the solo FW900 I am using at the moment, with a monitor hood and a Fresnel lens in front of it.


----------



## csm725

Wow. Are you going to get a deskmat for your peripherals or are you not into that sort of stuff?


----------



## CallsignVega

No, I will use a keyboard/tray under the desk.


----------



## csm725

Mechanical?


----------



## CallsignVega

Quote:


> Originally Posted by *csm725*
> 
> Mechanical?


If you mean a tray that mechanically slides in and out under the desk to support the keyboard/G13/Trackman, then yes. If you mean a mechanical-key keyboard, then no. I strangely like my Apple chiclet keyboard, lol. I don't do a whole lot of typing anyway, and I use the G13 for all my gaming.


----------



## csm725

I was referring to a KB. Hmmm, fair enough. If you want something similar to the G13, there are Topre switch keyboards, but some say even those are snake oil...
I assume you've tried mechanical KBs in the past?


----------



## CallsignVega

Quote:


> Originally Posted by *csm725*
> 
> I was referring to a KB. Hmmm, fair enough. If you want something similar to the G13, there are Topre switch keyboards, but some say even those are snake oil...
> I assume you've tried mechanical KBs in the past?


Yes. They are nice, but I don't do a whole lot of typing like some people, for whom it would be more beneficial.

On another note: does anyone have experience working with any of these materials: acrylic, polycarbonate, CR-39, or polyurethane? I have contacted my Fresnel lens manufacturer to see if he can tell me what material they are. I am interested in trimming them to fit my display setup perfectly. Curious whether the "score and snap" method would work or if cutting is involved.


----------



## Manyak

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes. They are nice but I don't do a whole lot of typing like some people in which it would be more beneficial.
> On another note: Does anyone have experience working with any of these materials: acrylic, polycarbonate, CR-39 or polyurethane? I have contacted my Fresnel lens manufacturer to see if he could tell me what material they are. I am interested in trimming them to fit my display setup perfectly. Curious if the "score and snap" method would work or is cutting involved.


I have with acrylic. It really depends on the thickness and type of acrylic, and either way I advise against it, because you're going to have grooves in it from the Fresnel lens anyway, so you never know what might happen with the snap. The most reliable way is a scroll saw at mid speed, or a table saw with a high tooth count and a thin blade (in other words: watch the heat). Then sand down the edge.


----------



## CallsignVega

I have an extra Fresnel that was pretty cheap to practice on. There are a few videos of people using the score-and-snap method, but I don't know how accurate the cuts/edges are. My cut edges will be completely visible and will butt up against the neighboring lens, so they have to be pretty damn good.


----------



## CallsignVega

The stainless steel desk has exceeded my expectations in both looks and strength. Really high quality and USA made. This thing will last a lifetime.


----------



## bigkahuna360

Does the government pay for all this?!?!


----------



## dtfgator

Quote:


> Originally Posted by *bigkahuna360*
> 
> Does the government pay for all this?!?!


I sure as hell hope not. What do you do for work, Vega?


----------



## iCrap

Subbed! Can't wait to see how this turns out... So do you have all three monitors yet?
Also, sort of off topic, but what is so good about those speakers you have that makes them $1000?


----------



## CallsignVega

Quote:


> Originally Posted by *dtfgator*
> 
> I sure as hell hope not. What do you do for work, Vega?


Military Pilot
Quote:


> Originally Posted by *iCrap*
> 
> Subbed! Can't wait to see how this turns out... So do you have all three monitors yet?
> Also, sortof off topic, but what is so good about those speakers you have that makes them $1000?


I have the monitors, but I am waiting for the other two Fresnel lenses. As for the speakers and their cost: they sound good.


----------



## CallsignVega

Just a quick look at a Fresnel lens in front of a FW900 during testing. The 22.6" viewable area appears to be around 26-27" with the lens:

http://youtu.be/rp8sIip-jdc


----------



## CallsignVega

I've just read this new article and it appears nVidia won't let 1155 do Quad-SLI:

http://vr-zone.com/articles/detailed-hands-on-pictures-of-gigabyte-s-g1.sniper-3/15100-2.html

So that means I'll have to go with X79 for Quad-SLI using single-GPU cards, or get two dual-GPU Keplers like the GTX 690/790, or whatever they are going to call it. If nVidia fixed the power problem on the new dual cards and I water-cool them, I might get 95% of the performance for half the cost. We shall see. So maybe my X79 water-block diagram was a prophecy!

I'd be more comfortable jumping into X79 if I knew there was a guarantee and not just rumors about Ivy Bridge-E fully supporting X79.


----------



## Nocturin

Wow. Well, at least we know what you do in your downtime.

Considering your multi-switch loop, what valves are you using to switch? Is this going to be a servo jerry-rig, or are you actually using automatic pool/spa valves? And will your pump be able to withstand the back pressure during the switch without shattering the impeller or burning up the motor?

You're in SC? Whereabouts? I'm in the upstate.

edit: supercalifragilicious subbed


----------



## iCrap

Quote:


> Originally Posted by *CallsignVega*
> 
> Just a quick look at a Fresnel lens in front of a FW900 during testing. The 22.6" view-able appears to be around a 26-27" with the lens:
> http://youtu.be/rp8sIip-jdc


It's hard to tell from your video, but does the Fresnel lens make the image discolored or distorted at all? I know I've seen them in places before, and I remember discoloration / weird effects. Does this happen?


----------



## CallsignVega

Quote:


> Originally Posted by *iCrap*
> 
> It's hard to tell from your video, but does the Fresnel lens make the image discolored or distorted at all? I know I've seen them places before and i remember discoloration / weird effects. Does this happen?


I wouldn't call it discolored. The only visual anomalies you will see are the circular cuts in the lens when viewing bright images, and a slight amount of chromatic aberration at the sides. That just means the lens separates the colors slightly, kind of like convergence that is a tad off, and it's only really noticeable on long straight lines. You can compensate a little with the monitor's convergence controls. A good Fresnel like mine, made in the USA with very fine grooves and designed for a widescreen format, also has virtually zero distortion.

The larger image and the lack of gaps between displays are well worth the slight image degradation.


----------



## Thecityskies

My jaw dropped.
Sub'd!


----------



## davidrt4

Where can i buy that desk or is it custom?
Subbed


----------



## PoopaScoopa

Quote:


> Originally Posted by *CallsignVega*
> 
> I've just read this new article and it appears nVidia won't let 1155 do Quad-SLI:
> http://vr-zone.com/articles/detailed-hands-on-pictures-of-gigabyte-s-g1.sniper-3/15100-2.html
> So that means I'll have to go with X79 for Quad-SLI using single GPU cards or get two dual GPU Kepler like the GTX 690/790 whatever they are going to call it. If nVidia fixed the power problem on the new dual cards, if I water cool them I might get 95% of the performance for half the cost. We shall see. So maybe my X79 water block diagram was a prophecy!
> I'd be more comfortable jumping into X79 if I knew there was a guarantee and not just rumors about Ivy Bridge-E fully supporting X79.


Even the MSI Big Bang Marshal P67 can't run dual 590s. It's been a problem with Nvidia for a while; they only recently allowed SLI on AMD-chipset mobos. 3-way SLI requires one NF200, 4-way requires two NF200 chips, and even 590 SLI still requires an NF200 chip on the mobo when the cards already have them onboard. Why? A licensing money grab, as each NF200 makes them ~$5.

The bandwidth between the PCIe lanes and the CPU does not change just because they add an NF200; that just adds latency between the connectors. According to the TweakTown review, at 1080p the NF200 causes a 2 fps loss, and at 1600p a 1 fps loss.
Quote:


> http://www.guru3d.com/article/triple-monitor-gaming-on-geforce-gtx-590-and-radeon-6990/16
> Where NVIDIA confused me personally was the fact that initially on a P67 Big Bang Marshal (SLI compatible) motherboard we *could not get the two GTX 590 cards running in SLI* mode. After spending a couple of hours trying to resolve this issue it suddenly hit me. NVIDIA *requires* an *NF200* chip (maybe even two) *on the motherboard for Quad-SLI*. Only then will the quad-SLI mode in the GeForce driver open up.
> 
> Now *each of the GTX 590 cards* *already has a NF200 chip* sitting in-between the two GPUs, so in theory there's no need for an additional NF200 IC on the motherboard as each card will get x8 PCIe lanes assigned to them. Still, NVIDIA requires you to purchase an expensive motherboard to get that supported. I'm sorry but that sounds like (not trying to sound like an asshat here) corporate greed and we feel it's something completely unnecessary. Heck it works fine for AMD as well right ? We do hope that NVIDIA will drop this driver limitation, as really that's what it in the end boils down to. So, you are required to purchase a (quad) SLI motherboard with NF200 bridge chip, something like the eVGA X58 Quad SLI or the Gigabyte X58 UD9, both roughly 500 EUR.


With Sandy Bridge and on, we'd have to hack the BIOS/EFI and add whatever the drivers are looking for to get SLI to work on non-certified boards, whereas with X38 and below (and AMD boards) we could simply hack the drivers.


----------



## CallsignVega

Wow, and I thought AMD was a nightmare to set up Eyefinity. I've been trying for many hours now to get my two 580s into Surround, and I am about to pull my hair out. When I go to configure Surround, it just says "connect displays" and lists zero displays connected. So freaking frustrating.

It randomly selects resolutions, etc. I've hit my first major snag in this build; I hope it doesn't take me out. Multi-GPU/multi-display setups in both camps are such a HUGE pain in the ass.


----------



## csm725

I hate multi-GPU.


----------



## tsm106

Quote:


> Originally Posted by *CallsignVega*
> 
> Wow, I thought AMD was a nightmare to set up Eyefinity. I've been trying for many hours now to get my two 580's into Surround and I am about to pull my hair out. When I go to configure surround it just says "connect displays" and lists zero displays connected. So freaking frustrating.
> Randomly selects resolutions etc. Hit my first major snag in this build, hope it doesn't take me out. Multi-GPU/Multi-display setups in both camps are such a HUGE pain in the ass.


Ah the joys of a difficult setup. Just keep your eye on the prize so to speak.


----------



## TA4K

Quote:


> Originally Posted by *tsm106*
> 
> Ah the joys of a difficult setup. Just keep your eye on the prize so to speak.


Make sure the prize has a Fresnel lens in front of it so it looks bigger


----------



## nvidiaftw12

I saw your build on [H] many months ago. It has inspired me greatly.


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> Ah the joys of a difficult setup. Just keep your eye on the prize so to speak.


You don't know how close I came to destroying stuff!

Success! Well, mostly a success. My setup during portrait nVidia Surround testing:

So during testing I kept wondering why it would not go into Surround mode, even when I was getting three green check marks on the Surround config page. I had the same FW900 .inf loaded for all three; everything was a go. But when it actually went to put them into Surround, it would fail. It would fall back into non-SLI "activate all displays" mode, showing two FW900s and a "generic non-PnP monitor". I tried like eight different drivers, moving the computer around, switching cables, and stuff. It was a super frustrating 8 hours.

So I thought, OK, maybe nVidia is super anal and wants actual EDID information from every display. So I disconnected all of my BNC cables and used regular VGA cables that could pass EDID. Sure enough, that got it working! So nVidia Surround requires EDID information. Surround is working, but I am stuck at the standard resolutions and refresh rates that the monitor sends out over EDID, like [email protected], [email protected] It won't let me adjust those at all. Not a deal-breaker, but not ideal either. Plus, I think BNC cables give a slightly better image, so maybe in the future I will do some crazy cable hack-job monstrosity that lets me use the BNC cables and pull EDID info at the same time. Suggestions welcome here! I don't even know if the FW900 will send out EDID info while the BNC cables are plugged in. Anyone know where I can get _super_ high quality VGA cables in 25-foot lengths? Or a way to break out the VGA to my BNC cables and still get the EDID signal to the GPUs?
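For anyone curious what Surround is actually reading, the advertised modes live in fixed byte positions of the EDID block, and are easy to decode by hand. A minimal sketch, assuming an EDID 1.3-style 128-byte base block (the example data is fabricated, not from Vega's monitors):

```python
# Decode the eight "standard timing" slots (bytes 38-53) of a base EDID block.
def standard_timings(edid: bytes):
    """Yield (width, height) pairs advertised in the standard-timing slots."""
    # A real EDID starts with the header 00 FF FF FF FF FF FF 00.
    assert len(edid) >= 54 and edid[0] == 0x00 and edid[1] == 0xFF
    for i in range(38, 54, 2):
        b1, b2 = edid[i], edid[i + 1]
        if (b1, b2) == (0x01, 0x01):  # 01 01 marks an unused slot
            continue
        width = (b1 + 31) * 8                      # stored as width/8 - 31
        ratios = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}  # EDID 1.3
        rw, rh = ratios[(b2 >> 6) & 0x03]          # top two bits: aspect ratio
        yield width, width * rh // rw

# Example: a fabricated block advertising a single 1920x1200 mode.
edid = (bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00]) + bytes(30)
        + bytes([0xD1, 0x19]) + bytes([0x01, 0x01] * 7) + bytes(74))
print(list(standard_timings(edid)))  # [(1920, 1200)]
```

A DDC emulator dongle works by serving exactly this kind of block back to the card, which is why programming one with the FW900's data should satisfy the driver while the actual video travels over BNC.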

Even with the huge bezel gaps during my testing, the FW900 portrait Surround setup is EPIC. You get super high resolution, ridiculously buttery-smooth CRT motion, zero viewing-angle problems, zero ghosting, and none of the backlight bleed, IPS glow, or anti-glare-coating problems you get with LCDs. nVidia Surround/multi-GPU with the CRTs has no stuttering or any problems in any of the games I tested. FW900s work extremely well in portrait, as you don't have to worry about pixel orientation, a change in viewing angles, or any of that nonsense. I'm definitely motivated now to work on the bezel-less Fresnel lens portion of the project.


----------



## CallsignVega

http://www.connectpro.com/prod_vid_ddcDVI.html

That looks very interesting. Allows you to program and emulate EDID info and then I'd be able to pass my analog signal along over BNC.


----------



## eskamobob1

Quote:


> Originally Posted by *CallsignVega*
> 
> http://www.connectpro.com/prod_vid_ddcDVI.html
> That looks very interesting. Allows you to program and emulate EDID info and then I'd be able to pass my analog signal along over BNC.


lol... that just went straight over my head... but everything looks amazing so far


----------



## tsm106

Quote:


> Originally Posted by *CallsignVega*
> 
> http://www.connectpro.com/prod_vid_ddcDVI.html
> That looks very interesting. Allows you to program and emulate EDID info and then I'd be able to pass my analog signal along over BNC.


Wow, it's like a PAC SWI-PS for your monitor!


----------



## dtfgator

So you said you are a military pilot, eh?

From the amount of money you are tossing at this build, I have drawn 3 conclusions:

1. You magically convinced the military you need a setup like this for flight simulators and training at home
2. You test new birds or fly on some super stealth tactical classified black ops missions








3. You put yourself on a ramen-noodle a night budget to afford rigs like this

Did I guess right?


----------



## nvidiaftw12

So, vega, you have 4 of these? Maybe you wouldn't mind getting rid of one?
~a fellow Alabamian.


----------



## tsm106

Quote:


> Originally Posted by *dtfgator*
> 
> So you said you are a military pilot, eh?
> From the amount of money you are tossing at this build, I have drawn 3 conclusions:
> 1. You magically convinced the military you need a setup like this for flight simulators and training at home
> 2. You test new birds or fly on some super stealth tactical classified black ops missions
> 
> 3. You put yourself on a ramen-noodle a night budget to afford rigs like this
> 
> Did I guess right?


4. You have a Sugar Momma!!


----------



## PoopaScoopa

I'd say: 5. Not married and without kids.


----------



## Canis-X

This looks awesome Vegas!! Pushing the envelope and trying out new things....I love it! Thanks for taking the time to share your thoughts and findings!


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> 4. You have a Sugar Momma!!


I wish.
Quote:


> Originally Posted by *PoopaScoopa*
> 
> I'd say: 5. Not married and without kids.


Bingo, aint no woman going to attach that ball and chain to my ankle!

Quote:


> Originally Posted by *Canis-X*
> 
> This looks awesome Vegas!! Pushing the envelope and trying out new things....I love it! Thanks for taking the time to share your thoughts and findings!


No problem, much more to come in the future.

Since the FW900 was designed for normal landscape mode and the heat rising, I've modded a super quiet Noctua fan to help move air while they are in portrait mode.

(And yes for you nit-pickers I meant for it to be slightly askew to align with the grate vent holes for mounting)


----------



## Ken1649

Vega, have you tried EDID Mask to trick the card?

Custom Resolutions on ATI / AMD Video Cards (Non-_EDID_ Method) -
_DisplayPort to DVI_ @_120hz_ | Widescreen Gaming Forum


----------



## csm725

+rep for some real modding going on here!


----------



## Hydroplane

Vega I just want to say that your setups are crazy, they inspire me to want to do something like this myself


----------



## JedixJarf

Quote:


> Originally Posted by *PoopaScoopa*
> 
> I'd say: 5. Not married and without kids.


Does not compute...

Hey Vega when are you going to start folding on that gear


----------



## Shrak

My god it's beautiful.


----------



## FPSandreas

subbed here and yt. That is some serious inspirationel work you are doing soldier boy!! xP


----------



## Draygonn

sub'd yet again


----------



## armartins

Ok... so how you plan to hold that resolution with only 1.5GB framebuffer until Kepler? As far as I recall Classified EVGA 580's don't came with 3GB, right?


----------



## Hellish

Quote:


> Originally Posted by *armartins*
> 
> Ok... so how you plan to hold that resolution with only 1.5GB framebuffer until Kepler? As far as I recall Classified EVGA 580's don't came with 3GB, right?


I dont understand how questions like this are even asked

http://lmgtfy.com/?q=EVGA+GTX+580+Classified+3GB

lastly he has the ultra so it is clearly 3GB all you have to do is check @ the egg to see they only come in 3GB.

I haven't seen a classified with 1.5GB for sale in a long ass time.


----------



## CallsignVega

I think I've found my motherboard for this build:

ASRock Z77 Extreme9. I hope they mean true 4-way SLI (judging by the printed PCB, not just a heat-sink tossed on for display) and not two dual-GPU cards like some boards try to advertise as "Quad-SLI" capable. Quad-8x PCI-E 3.0 should be plenty fast for 4x Kepler; per slot it is the same bandwidth as 16x PCI-E 2.0. This is the only Z77 board I've seen yet with the PLX PCI-E 3.0 lane multiplier chip. Looks like this may be the board for my build, as long as ASRock doesn't screw up the BIOS.
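The 8x-3.0-equals-16x-2.0 claim is easy to sanity-check: PCI-E 3.0 raises the per-lane rate from 5 GT/s to 8 GT/s and swaps 8b/10b encoding for the much more efficient 128b/130b. A quick back-of-the-envelope sketch (nominal spec figures, ignoring protocol overhead):

```python
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Nominal usable bandwidth in GB/s, per direction, for a PCIe link."""
    # (gigatransfers/s per lane, line-encoding efficiency)
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt, eff = specs[gen]
    return gt * eff * lanes / 8  # bits -> bytes

gen3_x8 = pcie_bandwidth_gbps(3, 8)    # ~7.88 GB/s
gen2_x16 = pcie_bandwidth_gbps(2, 16)  # 8.00 GB/s
```

The two come out within about 2% of each other, which is why an x8/x8/x8/x8 split off a PLX switch on gen 3 isn't the handicap it would have been on gen 2.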

Why not just go for X79 you may ask? Generally the 1155 chips like IB will overclock higher, be a lot cheaper, be faster in games clock for clock, run cooler, use less energy, and more than four cores is not used for any game that I know of.


----------



## Blizlake

You've chosen my dream board

Your opinion about 1155 vs. 2011 seems to be exactly the same as mine. If gaming is what the computer is used for, there's no point in going 2011 even if you've got loads of cash hanging around (rather send it to me). Even the extra PCIe lanes probably aren't needed due to the increased bandwidth of PCIe 3.0. And even if some games/apps were to use more than 4 cores, you can invest in an i7 processor which has hyperthreading (which is plenty for multithreaded programs).

Also, Subbed


----------



## dtfgator

That motherboard looks awesome! I always thought of ASRock as the company that cranks out cheap mATX motherboards, but between this and the Fatal1ty board they seem to have a really nice high-end range too, one that gives the Rampages a run for their money.

Glad to see that the build is coming along, please keep us updated!


----------



## armartins

Quote:


> Originally Posted by *Hellish*
> 
> I dont understand how questions like this are even asked
> http://lmgtfy.com/?q=EVGA+GTX+580+Classified+3GB
> lastly he has the ultra so it is clearly 3GB all you have to do is check @ the egg to see they only come in 3GB.
> I haven't seen a classified with 1.5GB for sale in a long ass time.


Sorry man, if you know so much! I'm really sorry for being out living in a bubble for some months (I really did) and all I remember was people complaining about the great clocking classified 580's not having 3GB versions. I didn't saw any reference to the ultra version, I've read all the thread, as usual, and Vega just said "got a few components in" and put a picture. I'm sorry again if I did not recognize at first sight or I've missed anywhere down the road the reference for the cards being ultra versions, to be honest I wasn't aware of the existence of that version, by the way that's the kind of info that I learn by reading forums like OCN. When I wrote *"as far as I recall"* you could infer that I could not be aware of the existence of such model. I really don't want to disturb this great thread (sorry Vega). Just please tone down, also for the way you have chosen to show your higher knowledge (by assuming it as mandatory, and throwing it on my face) you're being reported. If you have any further comment please send it straight to me and stop showing your nerd rage out there.


----------



## Hellish

Quote:


> Originally Posted by *armartins*
> 
> Sorry man, if you know so much! I'm really sorry for being out living in a bubble for some months (I really did) and all I remember was people complaining about the great clocking classified 580's not having 3GB versions. I didn't saw any reference to the ultra version, I've read all the thread, as usual, and Vega just said "got a few components in" and put a picture. I'm sorry again if I did not recognize at first sight or I've missed anywhere down the road the reference for the cards being ultra versions, to be honest I wasn't aware of the existence of that version, by the way that's the kind of info that I learn by reading forums like OCN. When I wrote *"as far as I recall"* you could infer that I could not be aware of the existence of such model. I really don't want to disturb this great thread (sorry Vega). Just please tone down, also for the way you have chosen to show your higher knowledge (by assuming it as mandatory, and throwing it on my face) you're being reported. If you have any further comment please send it straight to me and stop showing your nerd rage out there.


I loled, and the reason I said it like that is because you are asking it on the internet when google 9/10 times has the answer, but lmao report me big deal nothing is wrong with what I said to you, if anything it was too nice, ps dont pm me I dont want to hear from you.


----------



## Draygonn

Quote:


> Originally Posted by *armartins*
> 
> Ok... so how you plan to hold that resolution with only 1.5GB framebuffer until Kepler? As far as I recall Classified EVGA 580's don't came with 3GB, right?


1.5Gb for 3xFW900s?!? Have you seen any of Vega's previous mad scientist build threads? The guy knows what he's doing. And your response to Hellish has placed you on my ignore list.

I think I found one of Vega's other projects


----------



## iCrap

Quote:


> Originally Posted by *Draygonn*
> 
> 1.5Gb for 3xFW900s?!? Have you seen any of Vega's previous mad scientist build threads? The guy knows what he's doing. And your response to Hellish has placed you on my ignore list.
> I think I found one of Vega's other projects


^^ Lol.
Yea vega knows his stuff.

Waiting for more updates.


----------



## CallsignVega

Sweet! This ASRock guy confirmed the Z77 Extreme9 can do Quad-SLI at Quad-8x PCI-E 3.0 with the PLX chip in the video below. Me excite!


----------



## Hydroplane

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet! This ASRock guy confirmed the Z77 Extreme9 can do Quad-SLI at Quad-8x PCI-E 3.0 with the PLX chip in the video below. Me excite!


Awesome! Hopefully there won't be issues running 4 way SLI like there were with the NF200


----------



## Canis-X

Houston, we are good to go!


----------



## PoopaScoopa

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet! This ASRock guy confirmed the Z77 Extreme9 can do Quad-SLI at Quad-8x PCI-E 3.0 with the PLX chip in the video below. Me excite!


Wow, good to see Nvidia finally letting up on SLI requirements.


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> Wow, good to see Nvidia finally letting up on SLI requirements.


Ya, they must have realized PCI-E 3.0 at even 8x speed would be plenty.

Ambient radiator fans all mounted and wiring soldered. These things push some serious air and pretty darn quiet to boot.


----------



## csm725

Which fans are those?


----------



## Eggy88

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, they must have realized PCI-E 3.0 at even 8x speed would be plenty.
> Ambient radiator fans all mounted and wiring soldered. These things push some serious air and pretty darn quiet to boot.


That is the Mo-Ra 9x140 right? What kind of results do you have with it? I'm looking to put together a bench rig and looking @ a Mo-Ra3 9x140 with 18x 140mm High speed yate Loons.


----------



## lannshine

Aw Vega what do you do for a living?
Don't want to act like a fool but...
How do I learn all that stuff do modify computers?
All that awesome setting up makes me feel like (╯°□°）╯︵ ┻━┻


----------



## PoopaScoopa

Are you worried at all about the Aluminum in the rad?


----------



## nvidiaftw12

Quote:


> Originally Posted by *lannshine*
> 
> Aw Vega what do you do for a living?
> Don't want to act like a fool but...
> How do I learn all that stuff do modify computers?
> All that awesome setting up makes me feel like (╯°□°）╯︵ ┻━┻


He's a military chopper pilot.


----------



## CallsignVega

Quote:


> Originally Posted by *csm725*
> 
> Which fans are those?


They are XSPC.

140mm 1350RPM Fan
- 1350RPM
- Low Noise (~29dbA)
- Airflow 73.92 CFM
- Static Air Pressure 1.23 mmAq
- Operating Voltage Range 5.5-13.8V
- 3 Pin Power Connector
- RPM Monitor Wire
- 45cm Wire
- Closed Corners
- 25mm Depth
Quote:


> Originally Posted by *Eggy88*
> 
> That is the Mo-Ra 9x140 right? What kind of results do you have with it? I'm looking to put together a bench rig and looking @ a Mo-Ra3 9x140 with 18x 140mm High speed yate Loons.


Correct. I haven't hooked it up yet so I haven't tested performance. With IB and Kepler so close I am debating if it's even worth hooking everything up on my current system. Depends on how much free time I get. 9x 140mm seem to push a lot of air. 18x 140mm would be a beast! I could always add the extra 9 fans but from the testing I've seen on this radiator it would only be 1-2C extra drop. Will see in testing.
Quote:


> Originally Posted by *lannshine*
> 
> Aw Vega what do you do for a living?
> Don't want to act like a fool but...
> How do I learn all that stuff do modify computers?
> All that awesome setting up makes me feel like (╯°□°）╯︵ ┻━┻


I've just been "mechanical" minded all of my life, repairing cars, installing upgrades on sports cars etc.
Quote:


> Originally Posted by *PoopaScoopa*
> 
> Are you worried at all about the Aluminum in the rad?


Aluminum? All of the pipe cores are copper.


----------



## Mootsfox

Toss filters on your monitor fans before you get them dusty and have issues


----------



## rdrdrdrd

this appears to be awesome


----------



## CallsignVega

Quote:


> Originally Posted by *Mootsfox*
> 
> Toss filters on your monitor fans before you get them dusty and have issues


You think that will be pretty beneficial? I have the fans sucking from inside and blowing out to the side (monitors in portrait).


----------



## PoopaScoopa

Quote:


> Originally Posted by *CallsignVega*
> 
> Aluminum? All of the pipe cores are copper.


The solder(tin,alum,zinc?) which contacts the aluminium fins can sometimes come in contact with your loop. Not sure whether this particular rad does or not though.


----------



## iCrap

Quote:


> Originally Posted by *CallsignVega*
> 
> You think that will be pretty beneficial? I have the fans sucking from inside and blowing out to the side (monitors in portrait).


Yea i would add filters.


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> The solder(tin,alum,zinc?) which contacts the aluminium fins can sometimes come in contact with your loop. Not sure whether this particular rad does or not though.


The radiator is pretty good quality, all of the copper tubes are threaded/pressed directly into the end blocks. Shining a light into all of the connections all I see is the coated steel end blocks and the copper tubes themselves.
Quote:


> Originally Posted by *iCrap*
> 
> Yea i would add filters.


The fans blow outward so I would just be cleaning the exhaust air with filters.


----------



## tsm106

Quote:


> Originally Posted by *CallsignVega*
> 
> The radiator is pretty good quality, all of the copper tubes are threaded/pressed directly into the end blocks. Shining a light into all of the connections all I see is the coated steel end blocks and the copper tubes themselves.
> The fans blow outward so I would just be cleaning the exhaust air with filters.


You don't need filters with a datavac. You gots a datavac right Vega?


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> You don't need filters with a datavac. You gots a datavac right Vega?


That's it, this build is over! No datavac!

Hah, I've just used compressed air cans before but this does look interesting. I just might pick one up.


----------



## iCrap

Quote:


> Originally Posted by *CallsignVega*
> 
> The radiator is pretty good quality, all of the copper tubes are threaded/pressed directly into the end blocks. Shining a light into all of the connections all I see is the coated steel end blocks and the copper tubes themselves.
> The fans blow outward so I would just be cleaning the exhaust air with filters.


Oh nevermind then, i thought you were sucking in air.


----------



## CallsignVega

Testing custom mounts/stands for the portrait FW900's. These things are harder than you think to securely portrait mount and hold 100lbs!


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> Testing custom mounts/stands for the portrait FW900's. These things are harder than you think to securely portrait mount and hold 100lbs!


god that's a hell of an image. I jelly.


----------



## iCrap

Looks good vega.

How many FW900s are you doing? 3 or 4? With the border-less setup i guess you actually could do surround with 4..


----------



## Skoobs

Quote:


> Originally Posted by *CallsignVega*
> 
> That's it, this build is over! No datavac!
> Hah, I've just used compressed air cans before but this does look interesting. I just might pick one up.


http://www.amazon.com/Metro-Vacuum-MDV3TCA-DataVac-Carrying/dp/B000OVUH30

it sucks, it blows, and its made in the US.


----------



## Canis-X

Quote:


> Originally Posted by *Skoobs*
> 
> http://www.amazon.com/Metro-Vacuum-MDV3TCA-DataVac-Carrying/dp/B000OVUH30
> it sucks, it blows, and its made in the US.


...it's $227


----------



## CallsignVega

Quote:


> Originally Posted by *iCrap*
> 
> Looks good vega.
> 
> 
> 
> 
> 
> 
> 
> 
> How many FW900s are you doing? 3 or 4? With the border-less setup i guess you actually could do surround with 4..


I have five FW900's going, but three will be in nVidia Surround. That gives a 1.92:1 ratio, which I think is close to the perfect 2:1 ratio for "immersive" display setups.
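A quick way to sanity-check that ratio for any panel set: a side-by-side portrait wall is just (panels × short side) over (long side), since rotation swaps the axes. A small sketch (the resolutions below are illustrative; the exact figure depends on the mode actually being driven):

```python
def surround_ratio(width: int, height: int, panels: int = 3,
                   portrait: bool = True) -> float:
    """Combined aspect ratio of a side-by-side multi-monitor wall."""
    if portrait:
        width, height = height, width  # portrait rotation swaps the axes
    return (panels * width) / height

# Three 16:10 panels (e.g. the FW900's native 2304x1440) in portrait:
ratio = surround_ratio(2304, 1440)  # 3 * 1440 / 2304 = 1.875
```

Any 16:10 mode lands at 1.875:1 in a 3x1 portrait wall, so small differences from a quoted figure come down to the particular timing in use.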


----------



## naizarak

also, why ivy bridge? why no sb-e srx or something like that?


----------



## StormX2

Now that's pretty cool, those last 2 pictures in the OP - but I am waiting for no bezels at all

Glue the panels right to each other lol


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iCrap*
> 
> Looks good vega.
> 
> 
> 
> 
> 
> 
> 
> 
> How many FW900s are you doing? 3 or 4? With the border-less setup i guess you actually could do surround with 4..
> 
> 
> 
> I have five FW900's going, but three will be in nVidia Surround. That gives a 1.92:1 ratio, which I think is close to the perfect 2:1 ratio for "immersive" display setups.
Click to expand...

So, sell me one of the others?


----------



## tsm106

Quote:


> Originally Posted by *Skoobs*
> 
> http://www.amazon.com/Metro-Vacuum-MDV3TCA-DataVac-Carrying/dp/B000OVUH30
> it sucks, it blows, and its made in the US.


Quote:


> Originally Posted by *Canis-X*
> 
> ...it's $227


LOL, wrong one. Here's the proper one.

http://www.amazon.com/gp/product/B001U899HQ/ref=oh_o00_s00_i00_details


----------



## Khaotik55

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You don't need filters with a datavac. You gots a datavac right Vega?
> 
> 
> 
> That's it, this build is over! No datavac!
> 
> Hah, I've just used compressed air cans before but this does look interesting. I just might pick one up.
Click to expand...

Vega, Vega, Vega....

You have a generous income, a large interest in computer setups, and you still haven't purchased yourself an air compressor?

80+ PSI is nice.


----------



## iCrap

i dont understand the datavac, i already have an air compressor, isnt it the same thing?


----------



## jbuschdev

The DataVac is like a reverse vacuum. It doesn't work as well as canned air for really caked-on stuff, but it's good for a quick blast to clean out an entire system. If I had a computer shop I'd definitely use one over buying a bunch of canned air.

That said, an air compressor sounds like the best option if you can filter the air and keep the pressure low enough that it doesn't damage components.


----------



## CallsignVega

Man you guys must have dusty homes! I use a top quality allergen air filter in my HVAC intake and I barely get any dust. One can of "dust off" has lasted me like 2-years!


----------



## iCrap

Quote:


> Originally Posted by *CallsignVega*
> 
> Man you guys must have dusty homes! I use a top quality allergen air filter in my HVAC intake and I barely get any dust. One can of "dust off" has lasted me like 2-years!


Yea that's why im saying i don't understand the point of the datavac. I hardly ever get dust buildup, but even if i do 10 seconds with the air compressor and its gone.


----------



## tsm106

Quote:


> Originally Posted by *iCrap*
> 
> Yea that's why im saying i don't understand the point of the datavac. I hardly ever get dust buildup, but even if i do 10 seconds with the air compressor and its gone.


I dunno how often you use your compressor, but I never do, so I don't keep it filled. Waiting 10 minutes with a compressor running and clacking all over the place is not cool.

And the DataVac is stronger than a can if you use the needle-nose extension. A can empties in 4 mins... the DataVac never does. A compressor is huge, with big ole hoses, and loud as hell...


----------



## Billy_5110

I rarely subb to work log even if thoses are crazy because i don't really like e-mail notifications...

But i jsut can't get my eyes away from this one, i refresh each 5 mins i think to see if vega has made an update. hahahaha

This is a hell of a build.

May i ask you if you have pic of your computer set in '' another room''

If you need 25 feet of cable tor each it it msut be somehow far away.

Finally, it should be the most interisting work log ever on OCN.

Easpecially about the geo-thermal cooling.

My father was a military pilot also ( but he passed away now...) Respect to you Vega you do a beautiful work in both army and computer mod.


----------



## nvidiaftw12

Quote:


> Originally Posted by *Billy_5110*
> 
> I rarely subb to work log even if thoses are crazy because i don't really like e-mail notifications...
> 
> But i jsut can't get my eyes away from this one, i refresh each 5 mins i think to see if vega has made an update. hahahaha
> 
> This is a hell of a build.
> 
> May i ask you if you have pic of your computer set in '' another room''
> 
> If you need 25 feet of cable tor each it it msut be somehow far away.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally, it should be the most interisting work log ever on OCN.
> 
> Easpecially about the geo-thermal cooling.
> 
> My father was a military pilot also ( but he passed away now...) Respect to you Vega you do a beautiful work in both army and computer mod.


You don't have to subscribe to email. Just commenting does a non-email subscribe.


----------



## Op125

Awesome build Vega

Will be interesting to see the final result of this project. But I have a question about your previous setup. Because I think the triple 120Hz setup with the thin bezels looks awesome, and I am seriously considering copying that setup. But since you are doing this new project, can I assume that the bezels, even if they are really thin, will be annoying in the long run? Should I consider a single 2560x1440 display rather than a 3240x1920 eyefinity setup? I'm currently on a 1920x1080 120hz display.


----------



## iCrap

Quote:


> Originally Posted by *Op125*
> 
> Awesome build Vega
> 
> 
> 
> 
> 
> 
> 
> Will be interesting to see the final result of this project. But I have a question about your previous setup. *Because I think the triple 120Hz setup with the thin bezels looks awesome, and I am seriously considering copying that setup. But since you are doing this new project, can I assume that the bezels, even if they are really thin, will be annoying in the long run?* Should I consider a single 2560x1440 display rather than a 3240x1920 eyefinity setup? I'm currently on a 1920x1080 120hz display.


He won't have ANY bezels with this project.


----------



## Nocturin

Quote:


> Originally Posted by *iCrap*
> 
> He won't have ANY bezels with this project.


This. The Fresnel lenses are what drew me into this project (not ignoring the overall bad-assed-ness of the whole thing)

And why did you change your avatar? It was awesome.


----------



## iCrap

Quote:


> Originally Posted by *Nocturin*
> 
> This. The Fresnel lenses are what drew me into this project (not ignoring the overall bad-assed-ness of the whole thing)
> And why did you change your avatar? It was awesome.


Me? I got tired of it but i think i might change it back...


----------



## Op125

Quote:


> Originally Posted by *iCrap*
> 
> He won't have ANY bezels with this project.


Ye I know, but I was asking about his previous setup, with the three 120Hz screens. If it is worth getting an eyefinity setup like that because of the bezels (even if they are very thin) or if I should rather go for a single 2560x1440 display to get a high res without bezels.


----------



## crust_cheese

Quote:


> Originally Posted by *Hellish*
> 
> I loled, and the reason I said it like that is because you are asking it on the internet when google 9/10 times has the answer, but lmao report me big deal nothing is wrong with what I said to you, if anything it was too nice, ps dont pm me I dont want to hear from you.


Quote:


> Originally Posted by *Draygonn*
> 
> 1.5Gb for 3xFW900s?!? Have you seen any of Vega's previous mad scientist build threads? The guy knows what he's doing. And your response to Hellish has placed you on my ignore list.


What. I don't even get you guys. It's people like you that make the Internet and people in general unpleasant, thanks for your effort!


----------



## meeps

Quote:


> Originally Posted by *iCrap*
> 
> Me? I got tired of it but i think i might change it back...


iCrap NOOOO! your avatar was my 2nd most favorite thing about this build log (actual build being the 1st).

GROOVY BOBBING MAN MUST COME BACK

and vega, loving the love for CRTs

i definitely miss mine


----------



## Nocturin

Quote:


> Originally Posted by *iCrap*
> 
> Me? I got tired of it but i think i might change it back...


But you must, you must, you must revert! I get tired of mine every here and there, but then I remember how awesome it is

Quote:


> Originally Posted by *meeps*
> 
> iCrap NOOOO! your avatar was my 2nd most favorite thing about this build log (actual build being the 1st).
> GROOVY BOBBING MAN MUST COME BACK
> and vega, loving the love for CRTs
> 
> 
> 
> 
> 
> 
> 
> i definitely miss mine


Yes, the fact this is being done with heavy ass CRTs is just oh so AWESOMe!


----------



## CallsignVega

Quote:


> Originally Posted by *Billy_5110*
> 
> I rarely subb to work log even if thoses are crazy because i don't really like e-mail notifications...
> But i jsut can't get my eyes away from this one, i refresh each 5 mins i think to see if vega has made an update. hahahaha
> This is a hell of a build.
> May i ask you if you have pic of your computer set in '' another room''
> If you need 25 feet of cable tor each it it msut be somehow far away.
> 
> 
> 
> 
> 
> 
> 
> 
> Finally, it should be the most interisting work log ever on OCN.
> Easpecially about the geo-thermal cooling.
> My father was a military pilot also ( but he passed away now...) Respect to you Vega you do a beautiful work in both army and computer mod.


Thanks for the kind words.
Quote:


> Originally Posted by *Op125*
> 
> Awesome build Vega
> 
> 
> 
> 
> 
> 
> 
> Will be interesting to see the final result of this project. But I have a question about your previous setup. Because I think the triple 120Hz setup with the thin bezels looks awesome, and I am seriously considering copying that setup. But since you are doing this new project, can I assume that the bezels, even if they are really thin, will be annoying in the long run? Should I consider a single 2560x1440 display rather than a 3240x1920 eyefinity setup? I'm currently on a 1920x1080 120hz display.


On my previous minimalist bezel LCD setup, the gaps were so small that it didn't bother me at all. I am going with CRT's as they are king for gameplay and motion. You must remember all 2560x1440 displays have quite slow pixel response times. They have massive blur in games. So it is up to you if you want to attempt a 120Hz Eyefinity/Surround setup to keep 120Hz. I would personally use my 3x 120Hz portrait setup hands down versus a single 60Hz 2560x1440.
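The blur difference has a simple first-order explanation: when your eye tracks a moving object, the perceived smear is roughly the object's speed times how long each frame stays lit. A 60 Hz sample-and-hold LCD holds each frame for ~16.7 ms, while a CRT phosphor decays in a millisecond or two. A rough sketch of that model (the speed and persistence numbers are illustrative assumptions, and LCD pixel response time is ignored on top of this):

```python
def tracking_blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate eye-tracking blur width in pixels for a moving object."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # px/s, a fairly fast pan across the screen
lcd_60hz = tracking_blur_px(speed, 1000 / 60)  # full-frame hold: 16 px smear
crt = tracking_blur_px(speed, 1.5)             # short phosphor decay: ~1.4 px
```

Under these assumptions the CRT shows an order of magnitude less motion smear, which matches the "buttery smooth CRT motion" impression even before 120 Hz LCDs enter the comparison.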

Monitor supports done, testing setup before working on the Fresnel lens frames/brackets. New keyboard tray mounted to desk.


----------



## armartins

Quote:


> Originally Posted by *CallsignVega*
> 
> I have going on 5 FW900's but three will be three in nVIdia Surround. That gives a 1.92:1 ratio which I think is close to the perfect 2:1 ratio for "immersive" display setups.


My actual resolution in 3x1 portrait is 3840x1920 (bezel managed, 3x 1200P), exactly 2:1, and I'm really loving it. I've had another set of 3x 1050P panels, all of them portrait capable, but I never wanted to try portrait before your triple 120hz setup... Well, I like it so much I decided to do the bezel mod (and my monitors still have 4.5 years of warranty left). I really don't mind losing my 2:1 ratio; actually I want to shrink the bezels as much as possible, in my case from 28mm down to 8mm. Another nice thing about portrait is that I like the three panels without any angle, so I'll just wall mount them with 3x $12 VESA wall mount supports.

@update... are those bricks? Will the Fresnel support add stability to the overall structure? I know those things are heavy, but as-is it isn't inspiring confidence at first sight. Another thing: how do you deal with the ergonomics of monitors extending above eye line? My horizontal head tilt has dropped drastically, but the vertical tilt has increased vastly. Add to that the weight of a D2000 on my head for 8h+ and when I'm reading it really starts to get sore...


----------



## Bit_reaper

Wow you are quite brave balancing a heavy monitor like the FW900 on only 3 bricks

I'm guessing you are working on a more permanent mounting system along with the fresnel frame. Seeing all 3 monitor on the table like that really substantiates how massive this build is. Looking forward to the first pics with the lenses on


----------



## CallsignVega

Quote:


> Originally Posted by *armartins*
> 
> My actual resolution in 3x1 portrait is 3840x1920 (bezel managed, 3x 1200P), exactly 2:1, and I'm really loving it. I've had another set of 3x 1050P panels, all of them portrait capable, but I never wanted to try portrait before your triple 120hz setup... Well, I like it so much I decided to do the bezel mod (and my monitors still have 4.5 years of warranty left). I really don't mind losing my 2:1 ratio; actually I want to shrink the bezels as much as possible, in my case from 28mm down to 8mm. Another nice thing about portrait is that I like the three panels without any angle, so I'll just wall mount them with 3x $12 VESA wall mount supports.
> @update... are those bricks? Will the Fresnel support add stability to the overall structure? I know those things are heavy, but as-is it isn't inspiring confidence at first sight. Another thing: how do you deal with the ergonomics of monitors extending above eye line? My horizontal head tilt has dropped drastically, but the vertical tilt has increased vastly. Add to that the weight of a D2000 on my head for 8h+ and when I'm reading it really starts to get sore...


I have gotten used to a more vertical scan with the monitor extending above eye line, although I still keep the center point of the monitor well below my eye line for the more natural, slightly downward gaze that human eyes prefer. When I am reading web pages or playing a game, I keep what I am reading, or the center cross-hair, in that easier-to-view slightly-below-eye-level position.

On another note: the setup is very sturdy and secure. With almost 100 pounds sunk down into rubber isolation feet on each set of mounts and the center of gravity far behind the front two supports, it would literally take someone running into it to knock them over, which would apply to any display setup. Let's just say it would take many times the force to knock one or more of these displays over versus a regular LCD on its OEM stand sitting on the desk. It's just one of those things that pictures don't do justice. If you were to view the setup in person and push/pull the monitors, you would see how sturdy they are. Come on guys, you think I am ******ed enough to have almost $4000 in FW900's just blow over in a slight breeze?









Also, the picture is playing tricks on straight lines in regards to the desk. Measured with a T-square, the desk is completely flat. This stainless steel desk is far stronger than I could have hoped for. I think I could fit another three FW900's worth of weight on here without a problem.


----------



## d33r

Can you please tell me what LCD monitors are used in the picture with the blue owl desktop background, and how much modding you needed to do to get them like that? Thanks.


----------



## CallsignVega

Quote:


> Originally Posted by *d33r*
> 
> Can you please tell me what lcd monitors are used in the picture with the blue owl desktop background. and how much modding did you need to do to get them like that. thanks


http://www.overclock.net/t/1143724/3x-120hz-eyefinity-portrait-setup


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> I have gotten used to a more vertical scan with the monitor being over-eye-line, although I still keep the center point of the monitor far below my eye-line for the more natural, slightly downward gaze that human eyes prefer. When I am reading web pages or playing a game, I keep what I am reading or the center cross-hair in that easier-to-view, slightly below eye-level position.
> On another note: the setup is very sturdy and secure. When you have almost 100 pounds sunken down into rubber isolation feet on each set of mounts with the center of gravity far behind the front two supports. It would take literally someone running into it to knock them over, which would apply to any display setup. Let's just say it would take many more times the force to knock one or more of these displays over versus a regular LCD and OEM stand sitting on the desk. It's just one of those things that pictures don't do justice. If you were to view the setup in person and push/pull the monitors, you would see how sturdy they are. Come on guys you think I am ******ed enough to have almost $4000 in FW900's just blow over in a slight breeze?
> 
> 
> 
> 
> 
> 
> 
> 
> Also, the picture is playing tricks on straight lines in regards to the desk. The desk measured with a T-square is completely flat. This stainless steel desk is far stronger than I could hope for. I think I could fit another three FW900's weight on here without a problem.


Hahaha... here's a pic of Vega playin on his setup
http://www.esperino.com/wp-content/uploads/2012/01/Neck-Stretch-BF3.jpg
http://www.youtube.com/watch?feature=fvwp&NR=1&v=u6l4dAHKL-E

Vega, bricks are for noobs! I know I showed you before, but you should have gone with an exotic water mount setup like I did with my portrait a750s. Everyone knows that water is the strongest force on earth








http://img.photobucket.com/albums/v725/l88bastard/Thepowerofwaterandleverage.jpg
http://img.photobucket.com/albums/v725/l88bastard/crush2.jpg


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Hahaha... here's a pic of Vega playin on his setup
> http://www.esperino.com/wp-content/uploads/2012/01/Neck-Stretch-BF3.jpg
> Vega, bricks are for noobs! I know I showed you before, but you should have gone with an exotic water mount setup like I did with my portrait a750s. Everyone knows that water is the strongest force on earth
> 
> 
> 
> 
> 
> 
> 
> 
> http://img.photobucket.com/albums/v725/l88bastard/Thepowerofwaterandleverage.jpg
> http://img.photobucket.com/albums/v725/l88bastard/crush2.jpg


Haha those aren't just ordinary bricks. They are topped with a special NASA developed synthetic Acrylonitrile Butadiene Carboxy Monomer specially developed to secure in-place and withstand the intense pressure associated with the FW900.


----------



## Hydroplane

This is coming along nicely







Vega you should post a pic of your rig, I'm curious to know what you're using to keep that 2700K cool at 5.2 GHz. Air? Water? Disassembled air conditioner?


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> Haha those aren't just ordinary bricks. They are topped with a special NASA developed synthetic Acrylonitrile Butadiene Carboxy Monomer specially developed to secure in-place and withstand the intense pressure associated with the FW900.


Checkmate! Well then, I have Water and you have Earth... so who's got fire?!?

Lmao, I wanted to get this video in my last post before it was possibly quoted... lol, it's my favorite BF3 vid
http://www.youtube.com/watch?feature=fvwp&NR=1&v=u6l4dAHKL-E


----------



## Blizlake

Quote:


> Originally Posted by *CallsignVega*
> 
> Haha those aren't just ordinary bricks. They are topped with a special NASA developed synthetic Acrylonitrile Butadiene Carboxy Monomer specially developed to secure in-place and withstand the intense pressure associated with the FW900.


A.k.a "rubber"

Cool build man, digging the FW900's


----------



## iCrap

Looks good vega.








I'm not so sure about those bricks though.


----------



## CallsignVega

Quote:


> Originally Posted by *iCrap*
> 
> Looks good vega.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not so sure about those bricks though.


You know what they say in Soviet Russia...


----------



## TheBadBull

in soviet russia, good looks vega.
did i do it right?


----------



## CallsignVega

Quote:


> Originally Posted by *TheBadBull*
> 
> in soviet russia, good looks vega.
> did i do it right?


Haha. I think I know where I need to send my computer to get all of the coolant removed when it's time for a cleaning:


----------



## CallsignVega

I've got the fans all wired up and mounted. I had to switch from the under-mount setup I had before, as it put the fans' electric motors too close to the electron guns and introduced a very faint wobble to the picture. Now they are mounted a good 2.5 inches further away and the picture is crystal clear.










Testing the setup with the Skyrim AFK camera spin, which is very good for revealing motion problems (blur/ghosting/stuttering, etc.).

http://www.youtube.com/watch_popup?v=FScNdwuIm64&feature=youtu.be&vq=hd720

The motion, clarity and picture quality are far higher than my previous LCD setups. This is going to make for epic gaming! At 4320x2304 @ 80Hz, it has become very apparent that I need some upgraded video cards, especially if I want to keep a minimum of 80 FPS under all circumstances while pushing 10 megapixels. Every single game I ran required more than 2GB of memory, so it looks like I will have to hold off on the vanilla 2GB GTX 680's and get 4GB GTX 680's for 4-way SLI. Almost positive I will be pairing a 3770K with Gigabyte's Sniper 3 Z77 motherboard, which has confirmed 4-way SLI support.
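The VRAM pressure at that resolution works out roughly like this. A quick back-of-envelope sketch in Python; the 4 bytes/pixel figure is my own assumption for a plain 32-bit color buffer, and real driver allocations (AA samples, depth buffers, textures) are far larger, which is why games blow past 2GB:

```python
# Back-of-envelope math for 4320x2304 @ 80Hz surround.
# bytes_per_pixel = 4 assumes a simple 32-bit color buffer only;
# actual VRAM use with AA/depth/textures is much higher.

width, height = 4320, 2304
pixels = width * height
megapixels = pixels / 1_000_000

bytes_per_pixel = 4
frame_mb = pixels * bytes_per_pixel / 2**20   # one color buffer, MiB

print(f"{megapixels:.1f} MP")                 # about 10 MP, as stated
print(f"{frame_mb:.0f} MiB per bare frame buffer")
```

Even the bare buffer is sizable, and at 80 FPS minimum the GPUs have to fill it 80 times a second.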



Tomorrow the Fresnel lens frame build begins.


----------



## csm725

You think there's any chance you find a nicer looking mobo?








Looking awesome as usual though


----------



## rafety58

This build is looking great. I would love to have some epic CRTs to game on, but I'm stuck with my crappy LCD that can only do 75Hz at 1080P.


----------



## l88bastar




----------



## dizzy4

Quote:


> Originally Posted by *CallsignVega*
> 
> Here is my proposed simple cooling system. Let me know if you guys can see any way to make it more efficient.


All in all this is an awesome build! However, I am skeptical about your cooling solutions. Well, just the phase change to be exact. It looks like you are planning to just flip a switch and pump coolant to already installed waterblocks and watercooled parts. This really isn't possible with phase change, so I would reconsider your design and make the phase change unit into a chiller that can be turned on or off. You will most likely not be able to reach sub-zero temps with the setup that way, but it would still be awesome.
Quote:


> Originally Posted by *CallsignVega*
> 
> Here I have started the Geo-thermal drilling. This is the top 3 feet (17 feet underground). I've drilled near the home slab foundation and in a year-round shaded area to maximize Earth temperature stability. So far it looks like the ground water level is 5-6 feet below the surface which is GREAT. Finally a positive about living in the South.


Oh but I do love this idea







That is really cool

Good luck with the lenses tomorrow!


----------



## CallsignVega

Quote:


> Originally Posted by *csm725*
> 
> You think there's any chance you find a nicer looking mobo?
> 
> 
> 
> 
> 
> 
> 
> 
> Looking awesome as usual though


I actually like the look! A bit different than the typical red/black. Although my motherboards are usually in the bowels of some contraption never seen by mortals.








Quote:


> Originally Posted by *l88bastar*


What have you been feeding your Chihuahua?!
Quote:


> Originally Posted by *dizzy4*
> 
> All in all this is an awesome build! However, I am skeptical about your cooling solutions. Well, just the phase change to be exact. It looks like you are planning to just flip a switch and pump coolant to already installed waterblocks and watercooled parts. This really isn't possible with phase change, so I would reconsider your design and make the phase change unit into a chiller that can be turned on or off. You will most likely not be able to reach sub-zero temps with the setup that way, but it would still be awesome.
> Oh but I do love this idea
> 
> 
> 
> 
> 
> 
> 
> That is really cool
> Good luck with the lenses tomorrow!


Eh? The phase change unit has been completed and running for a year (I am just adapting it to work with this new setup also):

http://www.overclock.net/t/950313/vegas-quad-sli-3gb-gtx580-sub-zero-liquid-cooled-build



What do you guys think, 4x 4GB RAM or 2x 4GB RAM? I wonder if, with IB, 2x sticks (easier on the memory controller) will still overclock higher than 4x sticks, as with CPUs of the past. 16GB may be overkill; even with 8GB in my current system I don't use it all up for gaming.


----------



## dizzy4

Quote:


> Originally Posted by *CallsignVega*
> 
> I actually like the look! A bit different than the typical red/black. Although my motherboards are usually in the bowels of some contraption never seen by mortals.
> 
> 
> 
> 
> 
> 
> 
> 
> What have you been feeding your Chihuahua?!
> Eh? The phase change unit has been completed and running for a year (I am just adapting it to work with this new setup also):
> http://www.overclock.net/t/950313/vegas-quad-sli-3gb-gtx580-sub-zero-liquid-cooled-build


Ah well, that makes sense. You will be using the unit to cool a tank of coolant down to sub-zero temperatures, which can then go through the liquid cooling system just fine. So in effect, it is a large chiller unit. I thought you were attempting to integrate a direct-die phase change setup.


----------



## csm725

I didn't need to add more volts anywhere on my SB rig (sig) going from 2x2 to 4x4, just keep that in mind.


----------



## nvidiaftw12

Vega, here is what I would do. I would first put all the coolant through the MoRa 3; that would get it almost to room temperature. Then put it through the geothermal part, then through the phase change. That way, by the time it hits the phase unit it will already be below room temp and the phase can take it way below zero.


----------



## CallsignVega

During Fresnel lens fitting I found that my estimations were off a bit. With the magnification I want and seamless borders, I need to get the monitors over an inch closer to each other. This build just got a lot more difficult! I've already removed the FW900's swivel mounts, but it looks like I will have to remove the entire base supporting structure. Whether I have to remove the picture tube from the chassis to accomplish that is still up in the air.

Par for the course. Usually the difficult projects are the only ones worth doing in the first place!

My initial testing results in BF3 are phenomenal.


----------



## Nocturin

What about taking a Dremel to the "bottom" of the monitors? There's got to be at least another half inch or more of clearance in the chassis.

Are you going to build something to mount the monitors, or keep them on the bricks?


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> During Fresnel lens fitting I found that my estimations were off a bit. With the magnification I want and seamless borders, I need to get the monitors over an inch closer to each other. This build just got a lot more difficult! I've already removed the FW900's swivel mounts, but it looks like I will have to remove the entire base supporting structure. Whether I have to remove the picture tube from the chassis to accomplish that is still up in the air.
> Par for the course. Usually the difficult projects are the only ones worth doing in the first place!
> My initial testing results in BF3 are phenomenal.


Mmmmmmmmaybe you might wanna try a larger custom fresnel from that company I PM'd you. If they can get the focal points & grooves similar to the Kanteks but on a larger panel, you may not have to play doctor with your FW900s. I'd rather screw with a new set of glasses before brain surgery to cure near-sightedness!


----------



## AMD_Freak

If you ever get one built and keep it more than a week, I'm going to come up and see your setup; you're only a few miles away. You seem to be rebuilding every 2 weeks







maybe you could fly over, I see the choppers flying over all the time.


----------



## CallsignVega

Start with this:










And after about 13 hours of work, end up with this:










Complete lower platform on all FW900's removed, internal wires re-routed and modified, brackets cut, OSD control modules and power switches relocated and mounted on top. It is impossible to get the FW900's any closer; the metal CRT casings are physically touching one another.










I went from a 4.4 inch gap between each lit image to a 2.75 inch gap. That is a smaller gap than some multi-LCD setups! Now I think I can finally start working on the Fresnel mounts.


----------



## l88bastar

Very, very impressive!


----------



## Bit_reaper

Now that looks incredible. I can't believe how close together you managed to get them. A massive improvement over the previous setup


----------



## CallsignVega

One more shot of the mods:


----------



## Nitrogannex

Dear God......Sub'd


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Vega, here is what I would do. I would first put all the coolant thorough the mora 3. That would get it almost to room temperature. Then put it through the geo-thermal part. Then put it through the phase change. That way by the time it hits phase it will already be below room temp and the phase can take it way below zero.


The phase change unit is very powerful! On my 4-Way 580 build it was able to pull temperatures down to minus 27 C during testing in only a few minutes.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> The phase change unit is very powerful! On my 4-Way 580 build it was able to pull temperatures down to minus 27 C during testing in only a few minutes.


Yes, but it seems kinda pointless to use just one of the three cooling systems when you could use all of them. Besides, that could take a load off the phase and then you wouldn't have such a high electricity bill.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Yes, but it seems kinda pointless to use just one of the three cooling systems when you could use all of them. Besides, that could take a load off the phase and then you wouldn't have such a high electricity bill.


What you propose doesn't make any sense. The temperature of the coolant coming out of the computer, even after the system dumps its heat into the loop, is only slightly higher than when it went in. In essence, I would be sub-zero cooling the ground and turning my ambient radiator into an air-conditioning unit if I linked them all together.

The main loop (geothermal) is awesome because it takes all of the heat from the computer, dissipates it into the ground, and I get 11-14C chilled water for virtually zero energy use. The phase-change side has always been just for "benchmark run" type sessions.
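The "only slightly higher" claim checks out with basic heat-transfer math: steady-state temperature rise is Q / (m_dot * c_p). A quick sketch with assumed numbers (a 1 kW total heat dump and 4 L/min flow are my own illustrative figures, not measurements from this loop):

```python
# Steady-state coolant temperature rise across a loop: dT = Q / (m_dot * c_p).
# heat_w and flow_lpm are assumed example figures, not measured values.

heat_w = 1000.0    # total heat dumped by CPU + GPUs, watts (assumed)
flow_lpm = 4.0     # coolant flow rate, litres per minute (assumed)
cp = 4186.0        # specific heat of water, J/(kg*K)

mass_flow = flow_lpm / 60.0           # kg/s, taking 1 L of water as ~1 kg
delta_t = heat_w / (mass_flow * cp)   # temperature rise across the loop
print(f"{delta_t:.1f} C rise")        # roughly 3.6 C for these numbers
```

A few degrees of rise against 11-14C ground water is why chaining the sub-zero phase unit in series would mostly just chill the ground.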


----------



## iCrap

Nice job on making the border smaller









Also, sort of off topic, but where do you find wallpapers big enough for multi-display? I'm having a very hard time... would you mind posting that blue one (and any others if you have them)?


----------



## CallsignVega

View the video in 720P. Resolution 4320x2304, 80Hz of CRT smoothness. Initial placement testing of the lenses (not final position and bracketing). If you sit just right in your normal viewing position, both bezel gaps can get as low as 1-2mm in width; if you move your head a lot, the gaps will change a bit.

Side benefits are a larger image than three displays without the lenses would give, and a slight "depth" effect added to the image, which I quite like. With the lenses' magnification and sitting at the designed eye-point, the display setup is around 50 inches, running at 10 megapixels and 80Hz. It is hard for a camera to capture some of what I am describing accurately. My camera hand is also not too steady!

http://www.youtube.com/watch_popup?v=CBj3RfMGOV0&feature=youtu.be&vq=hd720


----------



## CallsignVega

Playing around a little more in Skyrim. Besides the awesome resolution and unparalleled CRT motion, the lenses add a little depth effect and it feels like I could crawl through the lenses and into the Skyrim world. The camera does not do it justice!

http://www.youtube.com/watch_popup?v=tFunVKsoFH8&feature=youtu.be&vq=hd720


----------



## l88bastar




----------



## Nocturin

That is amazing. I can't wait to see a high-res photo from the perfect spot.

How far do you have to sit in front of the screen? It seems like you're close enough to really create some immersion.


----------



## goodtobeking

OCN is full of these "run of the mill" computer builds, wish someone would do something exceptional for a change /sarcasm

Amazing is the only word to truly describe this. Love the lenses, they are something very interesting IMO. Now if we could buy them already customized for our monitors...


----------



## CallsignVega

Quote:


> Originally Posted by *Nocturin*
> 
> That is amazing. I can't wait to see a high-res photo from the perfect spot.
> How far do you have to sit in front of the screen? It seems like you're close enough to really create some immersion.


I would need some sort of professional camera with a wide-angle lens; my regular 720P camera cannot do pictures/video of this setup justice! My eyes are about 30" from the actual CRT surfaces and about 22" from the surface of the lenses. The gap between the CRTs and lenses is about 8" deep. The image fills up my vision quite nicely for that "immersive feel" and that great near-2:1 aspect ratio.

The image is so large and the motion is so good that bouncing around in the tank on rough surfaces in BF3 was getting me a little motion sick haha. Love it.
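Those distances imply a very wide angular field of view, which is where the "fills up my vision" feeling comes from. A rough sketch; the ~44" image width is my own assumption for a near-2:1 setup around 50" diagonal, and treating the magnified image as sitting at the lens plane overstates the angle somewhat:

```python
import math

# Horizontal field of view subtended by a flat image of a given width,
# viewed from a given distance. The 44" width is an assumed figure for a
# ~50" diagonal near-2:1 setup; 22" is the quoted eye-to-lens distance.

eye_to_lens_in = 22.0
image_width_in = 44.0

half_angle = math.atan((image_width_in / 2) / eye_to_lens_in)
hfov_deg = math.degrees(2 * half_angle)
print(f"~{hfov_deg:.0f} degrees of horizontal field of view")
```

For comparison, a single 24" monitor at the same distance covers only around 50 degrees.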


----------



## Citra

One word. BEAST.


----------



## dtfgator

Quote:


> Originally Posted by *CallsignVega*
> 
> The image is so large and motion is so good, bouncing around in the tank on rough surfaces while driving in BF3 was getting me a little motion sick ahah. Love it.


A motion sick pilot? Damn, those screens must feel real.


----------



## Smo

I gotta say Vega, that looks pretty incredible fella! Well done for thinking outside the box.


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> I would need some sort of professional camera with a wide-angle lens. My regular 720P camera cannot do pictures/video justice with this setup! My eyes are about 30" from the actual CRT surfaces and about 22" from the surface of the lenses. The gap between the CRT and lenses are about 8" deep. The image fills up my vision quite nicely for that "immersive feel" and that great near 2:1 aspect ratio.
> The image is so large and motion is so good, bouncing around in the tank on rough surfaces while driving in BF3 was getting me a little motion sick ahah. Love it.


That is incredible. I jelly, I jelly hard.

What resources are you using for the fresnel lenses? Is there math involved to get the "no bezel" effect? Are they super expensive?


----------



## CallsignVega

Quote:


> Originally Posted by *Nocturin*
> 
> That is incredible. I jelly, I jelly hard,
> What resources are you using for the fresnel lenses? Is there math involved to get the "no bezel" effect? Are they super expensive?


The lenses aren't too bad in price, $110 each:

http://www.amazon.com/Kantek-Monitor-Magnifier-Widescreen-MAG19WL/dp/B002JGKI6O/ref=sr_1_1?ie=UTF8&qid=1332359605&sr=8-1

There are some numbers involved: focal distance, magnification, aspect ratio (especially for the center Fresnel), size of the displayed image, angle of the image(s), and a lot of re-positioning and trial and error.
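The focal-distance/magnification relationship above can be sketched with the standard thin-lens magnifier formula. The numbers here are purely illustrative, not the actual Kantek MAG19WL specs:

```python
# Thin-lens sketch of the Fresnel positioning trade-off: with the CRT face
# inside the focal length, the lens forms a magnified virtual image with
# magnification m = f / (f - d). f and d values below are assumed examples.

def magnification(f_mm, object_mm):
    """Virtual-image magnification of a simple magnifier
    (object closer than the focal length: object_mm < f_mm)."""
    return f_mm / (f_mm - object_mm)

f = 450.0  # assumed focal length in mm, not a real lens spec
for d in (150, 200, 250):  # hypothetical CRT-face-to-lens distances, mm
    print(f"{d} mm -> {magnification(f, d):.2f}x")
```

This is why moving the monitors an inch matters: magnification climbs quickly as the screen approaches the focal length, so small distance errors change how the three images line up.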


----------



## CallsignVega

4x EVGA GTX 680's inbound.


----------



## atmosfar

It would be great if you could do a step-by-step zoom-in set of photographs so we can see what the entire setup looks like with the Fresnels, and also what it looks like from the seated position. You mentioned that your camera lens isn't wide enough; perhaps a tripod and some image stitching would solve that?


----------



## Hydroplane

Quote:


> Originally Posted by *CallsignVega*
> 
> 4x EVGA GTX 680's inbound.


Will 2GB be enough?


----------



## Mike-IRL

Quote:


> Originally Posted by *Hydroplane*
> 
> Will 2GB be enough?


For now I'm guessing.








4GB cards aren't out yet.


----------



## Hydroplane

Quote:


> Originally Posted by *Mike-IRL*
> 
> For now I'm guessing.
> 
> 
> 
> 
> 
> 
> 
> 
> 4GB cards aren't out yet.


According to here they'll be out very, very soon.


----------



## CallsignVega

Quote:


> Originally Posted by *atmosfar*
> 
> It would be great if you could do like a step-by-step zoom in set of photographs so we can see what the entire setup looks like with the Fresnel but also what it looks like in the seated position. You mentioned that your lens isn't wide enough - perhaps a tripod and some image stitching would solve that?


Hm interesting. I might just do some photo stitching to get a proper perspective.
Quote:


> Originally Posted by *Hydroplane*
> 
> According to here they'll be out very, very soon.


According to Jacob from EVGA, around ~2 months for 4GB cards.

In Surround tests the 2GB limit doesn't seem to be hurting the 680, even in games with high-res texture packs:











I will test them versus my 3GB 580's and see if the VRAM is crippling.


----------



## Cavi Mike

This is pretty awesome. You should take a video with a fish-eye lens.


----------



## Smo

Wow - are those graphs stock for stock (aka 7970 at 925MHz and 680 at 1058MHz)?

If so, the cards are pretty much identical (though when overclocked the 7970 looks like it'll edge forward slightly).

Going to stick with AMD this time around, I think.


----------



## wongwarren

Quote:


> Originally Posted by *Smo*
> 
> Wow - are those graphs stock for stock (aka 7970 at 925MHz and 680 at 1058MHz)?
> If so, the cards are pretty much identical (despite that when overclocked the 7970 looks like it'll edge forward slightly).
> Going to stick with AMD this time around I think.


680 can be overclocked too.


----------



## Smo

Quote:


> Originally Posted by *wongwarren*
> 
> 680 can be overclocked too.


Yeah, I'm well aware, but if you look at the graphs we've seen, the 7970 is 100MHz behind and achieving the same results, so it seems reasonable that a 1200MHz 7970 could perform the same as a 1300MHz 680.

All I'm saying is for what I've got, swapping cards isn't really justified. I'm not trying to put down the 680!


----------



## Faster_is_better

lolwut...

Stop making NASA jealous (and the rest of OCN).


----------



## Hydroplane

Even if you run out of VRAM you could just drop the resolution to 1200x1920 per screen instead of 1440x2304 and up the refresh rate; plus, CRTs have a "smoothing" effect so you could use less AA.
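That trade can be roughly quantified in terms of pixel clock, which is what actually limits a CRT mode. The 96Hz figure and the flat 25% blanking overhead are my own assumed stand-ins for real GTF/CVT timings, so treat these as ballpark numbers:

```python
# Rough pixel-clock comparison: per-screen 1440x2304 @ 80Hz versus
# 1200x1920 at an assumed 96Hz. The 25% blanking overhead is a stand-in
# for real GTF/CVT timing calculations.

def pixel_clock_mhz(w, h, hz, blanking=1.25):
    return w * h * hz * blanking / 1e6

hi_res = pixel_clock_mhz(1440, 2304, 80)   # current mode
lo_res = pixel_clock_mhz(1200, 1920, 96)   # lower res, higher refresh
print(f"{hi_res:.0f} MHz vs {lo_res:.0f} MHz")
```

The lower-resolution mode needs less pixel clock even at the higher refresh, so it trades VRAM and fill-rate load for smoothness without running into the RAMDAC limit.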


----------



## PoopaScoopa

And kill Windows Aero; it frequently eats up 500MB of VRAM. Strange how the 7970 performs the same as the 680 when MSAA is not in use; AMD has always been bad at that. The [H] Skyrim test looks faulty compared to HardwareCanucks' test.


----------



## CallsignVega

Quote:


> Originally Posted by *Smo*
> 
> Yeah I'm well aware, but if you look logically at the graphs we've seen, the 7970 is 100MHz behind and achieving the same results, so it would seem appropriate that a 1200MHz 7970 could perform the same as a 1300MHz 680.
> All I'm saying is for what I've got, swapping cards isn't really justified. I'm not trying to put down the 680!


I've read a lot of charts, and while the 7970 is a great card and closes the gap some at higher resolutions, I am still going to give the edge slightly to the 680's. The 680's should also be great overclockers; they have hit almost 1900MHz under LN2. Under my chilled water they should be beasts!
Quote:


> Originally Posted by *Faster_is_better*
> 
> lolwut...
> Stop making NASA jealous (and the rest of OCN).











Quote:


> Originally Posted by *Hydroplane*
> 
> Even if you run out of VRAM you could just drop the resolution to 1200x1920 per screen instead of 1440x2304 and up the refresh rate
> 
> 
> 
> 
> 
> 
> 
> plus CRTs have a "smoothing" effect so you could use less AA.


Quote:


> Originally Posted by *PoopaScoopa*
> 
> And kill Windows Aero. It frequently eats of 500MB of VRAM. Strange how 7970 performs the same as the 680 when MSAA is not in use. AMD has always been bad at that. The [H] Skyrim test looks faulty compared to HardwareCanucks' test.


Yes, the flexibility of the CRT Surround resolution/Hz is great. Plus, like you said, CRTs don't need nearly as much AA. I used to disable Aero when a game launched but stopped doing that with the 3GB cards. I may try it with the 680's, depending on how they benchmark on my setup versus the 3GB 580's.


----------



## Smo

Quote:


> Originally Posted by *CallsignVega*
> 
> I've read a lot of charts, and while the 7970 is a great card and closes the gap some at higher resolutions, I am still going to give the edge slightly to the 680's. The 680's should also be great overclockers; they have hit almost 1900MHz under LN2. Under my chilled water they should be beasts!


Yeah go for it dude - I'm not trying to put the 680 down at all! I'm just being realistic about whether or not I feel it's worthwhile selling the 7970s for, and I can't justify it, that's all


----------



## csm725

Smo, I would stick with your 7970s.


----------



## PoopaScoopa

You need one of these to get around the low OCP that you can't disable in the 680:

http://lab501.ro/placi-video/nvidia-geforce-gtx680-partea-ii-studiu-de-overclocking
































http://www.evga.com/products/moreInfo.asp?pn=100-UT-0400-BR&family=Accessories%20-%20Hardware&sw=4
EPower tips: http://kingpincooling.com/forum/showthread.php?t=1569

I'd love to see how you implement 4 of these into your system, haha.


----------



## Lovidore

Quote:


> Originally Posted by *CallsignVega*
> 
> 4x EVGA GTX 680's inbound.


Dammit, Vega. You just made me accidentally my keyboard again.


----------



## SpartanVXL

Hot damn... you put my sig Philips to shame.

I would love to get even just one of those FW900's but here in overpriced NZ they don't do CRT's anymore


----------



## Nocturin

Quote:


> Originally Posted by *d3_deeb*
> 
> Dammit, Vega. You just made me accidentally my keyboard again.


This.

Can't wait for your analysis of your quad-sli.


----------



## Ghooble

Today was the first time where I actually watched a video of your setup in action and I must say Vega. Your shiz is







worthy. Very good work


----------



## CallsignVega

Quote:


> Originally Posted by *Smo*
> 
> Yeah go for it dude - I'm not trying to put the 680 down at all! I'm just being realistic about whether or not I feel it's worthwhile selling the 7970s for, and I can't justify it, that's all


No reason to; stick with the 7970's, they are great cards. The biggest reason I went with nVidia is that only they can do 3x CRTs, and I find SLI just a tad smoother than X-fire in Eyefinity/Surround.
Quote:


> Originally Posted by *PoopaScoopa*
> 
> You need one of these to get around the low OCP that you can't disable in the 680:
> http://lab501.ro/placi-video/nvidia-geforce-gtx680-partea-ii-studiu-de-overclocking
> http://www.evga.com/products/moreInfo.asp?pn=100-UT-0400-BR&family=Accessories%20-%20Hardware&sw=4
> EPower tips: http://kingpincooling.com/forum/showthread.php?t=1569
> I'd love to see how you implement 4 of these into your system, haha.


LOL, *** is all that? Would be funny to go through all that and get like 10 more MHz on your overclock.
Quote:


> Originally Posted by *Ghooble*
> 
> Today was the first time where I actually watched a video of your setup in action and I must say Vega. Your shiz is
> 
> 
> 
> 
> 
> 
> 
> worthy. Very good work


Thanks!


----------



## hglazm

Where'd you get the Fresnel lenses from? I've been trying to track down a 17" for a cheap projector I want to slap together from an old LCD, but I can't find any suppliers.


----------



## gibsy

Nice build vega!








I wish I had the time, money and skill to do a job like yours..








By the way, one thing I wanna add.. it would be better if you could build a custom desk with a custom shelf that covers your lenses as well as the CRTs, just to make a clean finish..


----------



## Onions

o my word


----------



## stu.

I wanted to post something witty and clever after going through this entire thread.

But... I can't think of anything more rewarding than staring at these images. Again.


----------



## Nocturin

Quote:


> Originally Posted by *hglazm*
> 
> Where'd you get the Fresnel lenses from? I've been trying to track down a 17" for a cheap projector I want to slap together from an old LCD but I cant find any suppliers.


post # 195 of this thread









----------



## CallsignVega

Quote:


> Originally Posted by *gibsy*
> 
> Nice build vega!
> 
> 
> 
> 
> 
> 
> 
> 
> I wish I will have the time, money and skill as you are and do the job..
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, 1 thing i wanna add..It would be better if you can build a custom desk with custom shelf that can cover your lenses as well as the CRT..just want to make a clean finish..


The lens setup is getting its own housing. The aft portion of the FW900's will stay exposed; I like the industrial aluminum look.


----------



## CallsignVega

Two of my 4 GTX 680's came in. Hopefully they run my 3x FW900's without a hitch.










It is amazing how small and light these cards are. The 7970 isn't that large of a card (quite a bit smaller than the 6990's), but for nVidia to get more performance than a 7970 out of such a small package is mighty impressive.


----------



## Onions

Nice man, I started following you when you did that sub-zero quad (480's?) and I must say I'm impressed







good work


----------



## PoopaScoopa

Quote:


> Originally Posted by *CallsignVega*
> 
> LOL *** is all that. Would be funny to go through all that and you get like 10 more MHz on your overclock.


That's how Kingpin got 1900MHz with LN2: by replacing the onboard VRMs with the EVGA EPower. The very low OCP headroom of the reference 680's weak VRMs limits any serious overclocks, especially for your chilled WCing. Unless you wait a couple of months for the Lightning/Classified non-reference models.


----------



## Billy_5110

I want a pic of your computer, since it's hidden in another room. And how do you route your cables from point A to B?

PPPPPLLLLLEEEEEAAAAASSSSSEEEEE










Nice GTX 680s by the way. Holy crap, those are small!! But it makes sense; they were supposed to be mid-range cards at the beginning.


----------



## Lovidore

Quote:


> Originally Posted by *CallsignVega*
> 
> Two of my 4 GTX 680's came in. Hopefully they run my 3x FW900's without a hitch.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It is amazing how small and light these cards are. The 7970 isn't that large of a card (quite a bit smaller than the 6990's), but for nVidia to get more performance than a 7970 out of such a small package is mighty impressive.


What water blocks do you plan to get with these cards? Ek already has one out

Sent from GT-I9100 using Tapatalk


----------



## Canis-X




----------



## Hydroplane

Wow EVGA got those to you fast, they only came out yesterday lol


----------



## CallsignVega

Well, I found out that in nVidia's infinite wisdom they replaced the dual DVI-I ports found on GTX 580's with one DVI-I and one DVI-D, so each GTX 680 can only output to one analog monitor.

So I need a minimum of three 680's to run my setup, which is normally fine, but I only have two 680's at the moment (two more inbound) and only a Z68 MB that can do two-card SLI. So I have 680's and basically cannot use them until Z77 comes out, or I just use one monitor. I am tossing around the idea of getting a 3-way SLI capable Z68 from Amazon for temp use and just returning it when the Gigabyte G1.Sniper 3 Z77 launches.

I had 3GB 580 benchmarks done and everything. Blasted! Right now 680's are making good paper weights.







Maybe I will load up some games on my FW900's at 2560x1600 and crank everything to max and do some benchmarks/memory testing until I get the Surround monitor setup to 680's sorted.


----------



## l88bastar

Uggggg

Well there goes my plan of running three CRTs with two 680 gtxs in my mini-atx vulcan case









I don't know of any 1155 mobos that can handle three or four GPUs..perhaps you can "rent" a 2011 mobo/CPU for a couple of weeks


----------



## Hydroplane

Why couldn't you use a displayport to vga or hdmi to vga?


----------



## CallsignVega

Quote:


> Originally Posted by *Hydroplane*
> 
> Why couldn't you use a displayport to vga or hdmi to vga?


Doesn't work that way. DP and HDMI are digital only.

Disregard my earlier post; I changed out the SLI connector with a new one and it's working properly now. Yay for random failures!


----------



## Hydroplane

Quote:


> Originally Posted by *CallsignVega*
> 
> Doesn't work that way. DP and HDMI are digital only.


Ah I see. I was looking at the adapters available and thinking "why wouldn't that work?" but after reading some more about it you'd need to convert the digital signal into analog.

Do you have another monitor you could test it with? Maybe an LCD with both analog and digital inputs so you could narrow down the problem?

Edit: Actually since it works in windowed mode it seems like a driver problem.


----------



## l88bastar

So did you get sli single screen working....or did you get sli surround screen working?

Oh and I found footage of AMD & Nvidias driver team working out together


----------



## CallsignVega

There will be plenty of Z77 mobos launching that can run 3- and 4-way SLI. But in order to run three CRT's you need a minimum of three GTX 680's. Hence I need to get a temp MB to get my Surround setup working until the Z77 boards launch.


----------



## nvidiaftw12

Quote:


> Originally Posted by *l88bastar*
> 
> So did you get sli single screen working....or did you get sli surround screen working?
> Oh and I found footage of AMD & Nvidias driver team working out together


Itunes and windows right here.


----------



## l88bastar

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Itunes and windows right here.


----------



## TheBadBull




----------



## Willanhanyard

Looool. So why did you choose the CRTs? Since it seems like you have an everlasting pocket of money, why not just go triple 4K displays?


----------



## nvidiaftw12

Quote:


> Originally Posted by *Willanhanyard*
> 
> Looool. So why do you choose the CRT's? Since it seems like you have an everlasting pocket of money, why not just go triple 4K displays?


Refresh rate, no input lag, looks cool and oldschool.


----------



## rdrdrdrd

Quote:


> Originally Posted by *Willanhanyard*
> 
> Looool. So why do you choose the CRT's? Since it seems like you have an everlasting pocket of money, why not just go triple 4K displays?


better colors and smaller pixels


----------



## goodtobeking

Could you use a PCIe cable riser and mod a mount to run it over the rest, or do you not have any more PCIe x16 lanes??


----------



## zdude

one word

WOW


----------



## CallsignVega

Quote:


> Originally Posted by *rdrdrdrd*
> 
> better colors and smaller pixels


Not to mention 4K LCD's will have all the pitfalls of current LCD's.
Quote:


> Originally Posted by *goodtobeking*
> 
> Could you use a PCIe Cable Riser?? And mod a mount to run it over the rest, or do you not have anymore PCIe x16 express lanes??


Ya, I am out of 16x slots.







Z77 cannot come fast enough!


----------



## l88bastar

Damn according to this review SLI scaling of the 680s is crap








http://translate.google.com/translate?sl=sv&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fwww.sweclockers.com%2Frecension%2F15196-geforce-gtx-680-kepler-samt-sli%2F21%23pagehead&act=url

But then again I found crossfire scaling of the 7970s to be horrid due to micro-stutter....so if you get 40% out of the second card in SLI without microstutter, then Nvidia is still the one to choose.


----------



## rdrdrdrd

Scaling is one of the first things that improves with new drivers; NVIDIA will probably clean this up and use the raw power of the 680 to the fullest by the time Vega gets his Z77 board


----------



## TA4K

This may be a stupid suggestion, but when someone said using an adapter you said it wouldn't work. What if you used an active adapter? IIRC they can properly convert to analog signal. Maybe that might work.


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Damn according to this review SLI scaling of the 680s is crap
> 
> 
> 
> 
> 
> 
> 
> 
> http://translate.google.com/translate?sl=sv&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fwww.sweclockers.com%2Frecension%2F15196-geforce-gtx-680-kepler-samt-sli%2F21%23pagehead&act=url
> But then again I found crossfire scaling of the 7970s to be horrid due to micro-stutter....so if you get 40% out of the second card in SLI without microstutter, then Nvidia is still the one to choose.


There seem to be a lot of reviews that contradict the one you posted. Although, the 7970 does close a large portion of the gap:



Quote:


> Originally Posted by *TA4K*
> 
> This may be a stupid suggestion, but when someone said using an adapter you said it wouldn't work. What if you used an active adapter? IIRC they can properly convert to analog signal. Maybe that might work.


I cannot find an active adapter that has faster than a 162MHz DAC. That only allows a max of 1920x1200 @ 60Hz, a no-go for the FW900. If you find an active adapter with a faster DAC, please let me know.
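Rough pixel-clock math backs this up. A back-of-the-envelope sketch (the ~25% blanking overhead for CRT-style GTF/CVT timings is an assumption for illustration, not a measured figure):

```python
# A DAC must run at least as fast as the mode's pixel clock.
# CRTs need full blanking intervals, so reduced-blanking tricks don't apply.
def pixel_clock_mhz(h, v, hz, blanking_overhead=1.25):
    """Approximate pixel clock in MHz for an h x v @ hz mode with CRT-style blanking."""
    return h * v * hz * blanking_overhead / 1e6

print(pixel_clock_mhz(1920, 1200, 60))  # ~172.8 MHz -- already past a 162 MHz DAC
print(pixel_clock_mhz(2304, 1440, 80))  # ~331.8 MHz -- far beyond any adapter DAC of the era
```

With full CRT blanking, even 1920x1200 @ 60Hz brushes up against a 162 MHz DAC, and the FW900's higher modes need roughly double that.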


----------



## jimba86

Whoa, this is awesome, Vega!

Great work!

So with you saying that you can't run multiple analog monitors off the GTX 680, I guess this defeats what I wanted to do by running a U2711 + 17" VGA monitor off one card..

And what hardware are you going to be using in this build?


----------



## CallsignVega

Got bored since my third 680 isn't in to run my 3x FW900 setup so I tested one FW900 at 2560x1600 with each applicable game having all settings maxed.










The most VRAM use I saw in BF3 was 1930 MB, Crysis 2 was 1971 MB, and the most in Skyrim was 2028 MB, all with no slow-downs with the 680's. The other games were well under 2GB usage.

I am not sure if the limit on the 8x/8x PCI-E 2.0 slots were hurting the 680's more than the 3GB 580's but the 680's in SLI only ended up an average of 24% faster than the 580's. A bit lower than I was expecting.

There was an anomaly I could reproduce: over 100% scaling in 680 SLI Skyrim. If you include that result, SLI scaling is a perfect 100%. If you exclude it and go off the other four games, the SLI scaling is 90%. Still pretty darn good. As for the reason Skyrim was scaling so incredibly in SLI, it might be a driver issue with the single card, as in both instances the GPUs were at max utilization.
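For anyone wanting to reproduce the arithmetic: scaling here is just the extra frame rate the second card delivers relative to a single card. The frame rates below are hypothetical, purely to illustrate the calculation (including an over-100% anomaly like the Skyrim result):

```python
# Hypothetical single-card vs SLI frame rates (not Vega's actual numbers),
# showing how per-game SLI scaling percentages are derived.
def sli_scaling(fps_single, fps_sli):
    """Percent of a second card's theoretical contribution actually realized."""
    return (fps_sli / fps_single - 1) * 100

games = {"Game A": (50, 95), "Game B": (60, 114), "Game C": (40, 82)}
for name, (single, sli) in games.items():
    print(f"{name}: {sli_scaling(single, sli):.0f}% scaling")
# Game A and B land at 90%; Game C shows 105%, an over-100% anomaly
# that usually points to the single-card result being driver-limited.
```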


----------



## Blizlake

That's some major GPU power







But really, the Metro 2033 results made me giggle a little, didn't know it required that much juice


----------



## Lovidore

Hm.. So you expect that Quad-SLI scaling could compensate for the massive res that is 3*1600p? You could probably run that but not at >100Hz don't you think?

Also, have you looked into all the hype that's been going around on OCN about those OCable Catleaps?


----------



## nvidiaftw12

That's by far one of the highest I have ever seen on Metro 2033. Hell, I think two Asus Mars II cards could barely get 49 avg.


----------



## CallsignVega

Quote:


> Originally Posted by *d3_deeb*
> 
> Hm.. So you expect that Quad-SLI scaling could compensate for the massive res that is 3*1600p? You could probably run that but not at >100Hz don't you think?
> Also, have you looked into all the hype that's been going around on OCN about those OCable Catleaps?


Ya, I won't be running 3x 2560x1600. 3x 1920x1200 seems to be the sweet spot and Quad 680 should do really well. Although I can comfortably bump that up to 3x 2304x1440.

Ya, I've seen the Korean monitor threads. Even though their internal circuitry can overclock some, they still have slow IPS pixel response times, which causes massive blur. If 3-5x 120Hz TN panels are too slow compared to my CRT's, those are even more so. You would just have to view my setup in person to understand what blur-free, lifelike motion with a near-seamless image is like.







I am still surprised when I use it!

LCD tech is just inferior for gaming. I think the only thing that will be able to replace my setup is OLED or direct-view surface LED's if they ever get the pixels small enough.


----------



## l88bastar

Vega don't worry about the lackluster sli scaling in BF3 with the Nvidia 680 gtxs cause AMD is gonna release some mature drivers which should clear all of that right up!


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Vega don't worry about the lackluster sli scaling in BF3 with the Nvidia 680 gtxs cause AMD is gonna release some mature drivers which should clear all of that right up!


lol. If you look at the chart, BF3 had 94% scaling in my tests. That's pretty good!









This sucks not being able to play any games in my Surround setup until I get the 3rd 680/new MB sorted out. Maybe I need to hook up my 4th FW900 on a side table in landscape to use until then.


----------



## CallsignVega

Hmm, since I am at a decision point with a new motherboard, what would you guys do? This MB will be for my full-on 4-way GTX 680 setup.

Just go ahead and purchase a X79 board now like the Rampage IV Extreme, MSI Big Bang - Xpower II or ASRock Fatal1ty X79 that can do native PCI-E 3.0 at 16x/8x/8x/8x with a 3820/3930k/3960k? Or just get a temp Z68 MB to run 3-way GTX 680 and then purchase the Gigabyte Z77 Sniper 3 when it launches with a 3770k that with a PLX chip can do 8x/8x/8x/8x? I wonder if that PLX chip will lower performance compared to the native speeds of X79.

I am kinda wary of the SB-E chips barely being capable of 5GHz even under chilled water, and then regretting the purchase if IB launches and can do something like 5.5GHz under chilled water. Thoughts?

Or I could be really crazy and wait for the ASRock Extreme11 X79 that has two PLX chips for 16x/16x/16x/16x but that won't release for another couple months and not sure how much help that super bandwidth will be over native X79 PCI-E 3.0 speeds.
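The native-X79 vs PLX question mostly comes down to per-lane bandwidth. A sketch of the theoretical per-direction numbers, assuming the textbook line rates and encoding overheads (8b/10b for gen 2, 128b/130b for gen 3):

```python
# Approximate per-direction PCIe bandwidth in GB/s.
# Raw line rate (GT/s) times encoding efficiency times lane count, over 8 bits/byte.
def pcie_gb_s(gen, lanes):
    rate_gt, eff = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}[gen]
    return rate_gt * eff * lanes / 8

print(pcie_gb_s(2, 8))   # ~4.0 GB/s  -- gen 2 x8
print(pcie_gb_s(3, 8))   # ~7.9 GB/s  -- gen 3 x8, nearly a gen 2 x16
print(pcie_gb_s(3, 16))  # ~15.8 GB/s -- gen 3 x16
```

So an 8x/8x/8x/8x PCIe 3.0 arrangement already matches 16x slots on PCIe 2.0 per card, which is why a PLX board typically costs only a frame or so versus native lanes.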


----------



## PoopaScoopa

I'd get a temp Z68 for 3-way. Your 5.2 2700K would be better for gaming till IB is out, and gives better SLI scaling than 4.8. 8x 3.0 should be faster due to lower overhead. PLX usually only gives a 1 fps difference at 1600P.


----------



## rdrdrdrd

Well, if I was in your position I would get X79 and then keep it until IB-E drops, or perhaps just get the new SR-X. On the other hand it is kinda overkill, but that's what this system is all about after all


----------



## Hydroplane

Isn't Ivy Bridge supposed to be out in two weeks?


----------



## TA4K

Quote:


> Originally Posted by *rdrdrdrd*
> 
> Well if I was in your position I would get X79 and then keep it until IB-E drops, or perhaps just get the new SR-X. On the other hand it is kinda overkill, but thats what this system is all about after-all


You would also need to think about clock-for-clock performance: SB-E isn't that much better than SB in single thread, and it can't OC as far as the 2700K anyway. I think you would be better off getting a 3-way capable Z68 board, then changing to a Z77 board when they come out. Apparently they should be out soon, as the manufacturers have already announced them. The 2700K will run in a Z77 board, so you can go to 4-way before the 3770K even comes out. The only drawback will be 8x/8x/8x/8x, first at PCIe 2.0, then at 3.0. You will probably not see much of a performance loss with 8/8/8/8 2.0, because you will not have experienced 8/8/8/8 3.0 unless you have benches for 16/16/16/16 2.0, which don't even exist anyway AFAIK. Just my 2c.


----------



## rdrdrdrd

Quote:


> Originally Posted by *TA4K*
> 
> You would also need to think about clock for clock performance, SB-E isn't that much better than SB in single thread, and it can't OC as far as the 2700K anyways. I think you would be better off getting a 3way capable z68 board, then changing to a Z77 board when they come out. Apparently they should be out soon as the manufacturers have already announced them. The 2700K will run in a Z77 board, so you can go to 4way before the 3770K even comes out. The only drawback will be 8x/8x/8x/8x, first at PCIE 2.0, then at 3.0. You will probably not see that much of a performance loss with 8/8/8/8 2.0, because you will not have experienced 8/8/8/8 3.0 unless you have benches for 16/16/16/16 2.0, which don't even exist anyway AFAIK. Just my 2c.


but..... moar cores.... lol. If you're trying to spend frugally, the quads make more sense in this case, but the extra power, while not needed per se, is awesome enough to sacrifice maybe 1-2% in single-threaded apps.


----------



## CallsignVega

Quote:


> Originally Posted by *Hydroplane*
> 
> Isn't Ivy Bridge supposed to be out in two weeks?


Thanks for all of the thoughts guys. To Hydroplane, it looks like April 29th according to the latest leaks, which is still over a month away. The leaks also say Intel is holding onto Z77 boards for the launch so early Z77 is also a no-go. That's too long to be without my surround setup.









Maybe I will do a X79 vs Z77 4-way GTX 680 comparo.


----------



## Lovidore

Quote:


> Originally Posted by *CallsignVega*
> 
> Thanks for all of the thoughts guys. To Hydroplane, it looks like April 29th according to the latest leaks, which is still over a month away. The leaks also say Intel is holding onto Z77 boards for the launch so early Z77 is also a no-go. That's too long to be without my surround setup.
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe I will do a X79 vs Z77 4-way GTX 680 comparo.


I don't suppose you can go wrong with going X79. I don't know what constraints you have when it comes to cash, but X79 gives you some solid upgrade options without the headache of mobo upgrades, at least for another year. Although I'm sure you're already aware of that.


----------



## csm725

The point was that X79 was overkill and unnecessary to max out games better than IB would.


----------



## CallsignVega

5th GTX 680 inbound (mission-critical apps like Surround BF3 need a replacement backup in case of failure). Couldn't let the 4-way SLI grid be down for too long.


----------



## Eggy88

Quote:


> Originally Posted by *CallsignVega*
> 
> 5th GTX 680 inbound (mission critical apps like Surround BF3 need's a replacement backup in case of failure). Couldn't let the 4-way SLI grid be down for too long.


Yes ofc, who does not have a $500 card as a spare just in case one gets damaged. I guess you have a spare Mustang in the garage just in case your first one breaks down too?


----------



## CallsignVega

ASUS RAMPAGE IV EXTREME or EVGA X79 Classified?

Both have MB water blocks available from EK.


----------



## rubicsphere

Don't go EVGA!!!

I had the EVGA X79 SLI and it was a nightmare


----------



## zosothepage

Quote:


> Originally Posted by *CallsignVega*
> 
> ASUS RAMPAGE IV EXTREME or EVGA X79 Classified?
> Both have MB water blocks available from EK.


I have the R4E and I love it


----------



## zdude

The Classy is a dream; I have it and love it, go with EVGA. For a noob like me their customer support sweetens the deal too, but you obviously aren't a noob, so go with what you want, or better yet get both and tell us which is better.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> ASUS RAMPAGE IV EXTREME or EVGA X79 Classified?
> Both have MB water blocks available from EK.


How about a Msi big bang xpower II? Or you could try all since you clearly have the money.


----------



## CallsignVega

Quote:


> Originally Posted by *rubicsphere*
> 
> Don't go EVGA!!!
> I had the EVGA x79 SLi and it was a nightmare


BIOS issues? I've read that EVGA's BIOS work has been kinda slacking lately.
Quote:


> Originally Posted by *zosothepage*
> 
> i have the R4E and i love it


You getting rid of it for the Gigabyte Sniper 3 Z77?
Quote:


> Originally Posted by *zdude*
> 
> the classy is a dream, I have it and love it go with EVGA, for a noob like me their costumer support sweetens the deal to, but you obviously arn't a noob, go with what you want or better yet get both and tell us which is better.


No BIOS problems?
Quote:


> Originally Posted by *nvidiaftw12*
> 
> How about a Msi big bang xpower II? Or you could try all since you clearly have the money.


That board looks pretty sweet, but I am not finding any MB water blocks for it, plus I've had quite a few problems with my current MSI board. Kinda makes me shy away from a company when their previous products don't do so well.


----------



## zdude

No. However, when I did try updating the BIOS my CPU threw a temper tantrum, but it turned out to be me and the fact that I have a crappy OCing CPU.

Other than that, no problems whatsoever and I'm perfectly happy


----------



## iCrap

I would get the ASUS. Never had any issues with my ASUS stuff


----------



## rdrdrdrd

Rampage, ASUS is solid


----------



## PCModderMike

Subadubdub







On board to catch as much of this as possible


----------



## Ghooble

Quote:


> Originally Posted by *CallsignVega*
> 
> 5th GTX 680 inbound (mission critical apps like Surround BF3 need's a replacement backup in case of failure). Couldn't let the 4-way SLI grid be down for too long.










Got a spare house with a spare Dodge Challenger just in case as well?!


----------



## Hellish

Go X79 with the 3930K/3960X and just get the Intel Performance Tuning Protection Plan; pump out whatever volts you need for 5.2GHz+, and if it dies they give you a new CPU


----------



## zosothepage

@Vega: no brother, I really like it. I have it paired with a 3960X @ 5.2. I have got it up to 5.5GHz, but not 24/7. My father and I own a computer shop and we have a few R4Es in stock. Message me if you are interested; I can get you a really great price on one


----------



## CallsignVega

Quote:


> Originally Posted by *zosothepage*
> 
> @vega no brother i really like it i have it paired with a 3960X @5.2 it have got it up to 5.5GHZ but not 24/7 my father and i own a computer shop and we have a few R4E in stock Message me if you if are interested i can get you a really great price on one


Sweet, what voltage for 5.2GHZ? Water?


----------



## CallsignVega

Sweet, got a Rampage IV Extreme and a nice clocking 3960X inbound and the other three GTX 680's in the mail. Now I just need to find some of that 2666 MHz DDR3 4x 4GB. Or do you guys think 2400MHz will suffice?

I might have to re-think my cooling loop on the CPU side to give it more breathing room now.
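Whether 2400 vs 2666 matters can be eyeballed from theoretical peak bandwidth. A quick sketch, assuming standard DDR3 math (8 bytes per 64-bit channel, quad channel on X79):

```python
# Theoretical peak DDR3 bandwidth: transfers per second x 8 bytes per
# 64-bit channel x number of channels, reported in GB/s.
def ddr3_gb_s(mt_s, channels=4):
    return mt_s * 8 * channels / 1000

print(ddr3_gb_s(2400))  # 76.8 GB/s
print(ddr3_gb_s(2666))  # ~85.3 GB/s -- about an 11% ceiling bump
```

An 11% difference in a theoretical ceiling rarely shows up in games, which lines up with the "RAM won't matter for gaming" reply below my post count.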


----------



## PoopaScoopa

Hope you upgraded the circuit breakers since last time








If all you're doing is gaming, the RAM won't matter.


----------



## wongwarren

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet, got a Rampage IV Extreme and a nice clocking 3960X inbound and the other three GTX 680's in the mail. Now I just need to find some of that 2666 MHz DDR3 4x 4GB. Or do you guys think 2400MHz will suffice?
> I might have to re-think my cooling loop on the CPU side to give it more breathing room now.


The only 2600 MHz RAM I can find on NewEgg. It's 4 X 4 GB and it costs as much as a GTX 680.


----------



## CallsignVega

I've never heard of "Team" memory before. Any good?


----------



## coolhandluke41

yes it's good
http://www.overclock.net/t/1229517/team-xtreme-lv-2400-9-11-11-28
this will also be available in near future (any day now)
http://www.overclock.net/t/1226434/gskill-shows-quad-channel-ddr3-kits-at-2666-mhz-1-65v


----------



## CallsignVega

Quote:


> Originally Posted by *coolhandluke41*
> 
> yes it's good
> http://www.overclock.net/t/1229517/team-xtreme-lv-2400-9-11-11-28
> this will also be available in near future (any day now)
> http://www.overclock.net/t/1226434/gskill-shows-quad-channel-ddr3-kits-at-2666-mhz-1-65v


Those 2666 G.Skill timings are getting a bit high. I went with:

http://www.newegg.com/Product/Product.aspx?Item=N82E16820313243


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet, got a Rampage IV Extreme and a nice clocking *3960X* inbound and the other three GTX 680's in the mail. Now I just need to find some of that 2666 MHz DDR3 4x 4GB. Or do you guys think 2400MHz will suffice?
> I might have to re-think my cooling loop on the CPU side to give it more breathing room now.


Why this and not the 3930K? I understand excessive, but I thought you were actually trying to avoid that







.

I'll never understand intel extreme pricing.


----------



## CallsignVega

Quote:


> Originally Posted by *Nocturin*
> 
> Why this and no the 3930k? I understand excessive, but I thought you were actually trying to avoid that
> 
> 
> 
> 
> 
> 
> 
> .
> I'll never understand intel extreme pricing.


I bought Loud_Silence's used 3960X that clocks quite nicely for only a couple hundred more than a new 3930K, so it works out well.

This is the chip:

http://hwbot.org/submission/2236455_l0ud_sil3nc3_superpi_core_i7_3960x_6sec_563ms

Hopefully I will be able to run it around 5.2GHz under chilled water.


----------



## Canis-X

Nice, and congrats on your acquisitions!!


----------



## Op125

Quote:


> Originally Posted by *CallsignVega*
> 
> I bought Loud_Silences used 3960X that clocks quite nicely for only a couple hundred more than a new 3930K, so it works out well.
> This is the chip:
> http://hwbot.org/submission/2236455_l0ud_sil3nc3_superpi_core_i7_3960x_6sec_563ms
> Hopefully I will be able to run it around 5.2GHz under chilled water.


I have noticed that you are running your chips at fairly high frequencies. I seem to remember that you ran your 990X at 5GHz and you're running your 2700K at 5.2GHz. But just out of curiosity; what kind of voltages are you feeding these chips? What is considered a safe 24/7 voltage with the chilled water cooling that you have?


----------



## CallsignVega

Quote:


> Originally Posted by *Op125*
> 
> I have noticed that you are running your chips at fairly high frequencies. I seem to remember that you ran your 990X at 5GHz and you're running your 2700K at 5.2GHz. But just out of curiosity; what kind of voltages are you feeding these chips? What is considered a safe 24/7 voltage with the chilled water cooling that you have?


I usually run my chips 1.5 - 1.52v. Never had any problems. The key is keeping temps down.


----------



## TA4K

Quote:


> Originally Posted by *CallsignVega*
> 
> I usually run my chips 1.5 - 1.52v. Never had any problems. The key is keeping temps down.


lol yes. 1.5v is fun, especially when getting a 4.16GHz 45nm Pentium Dual Core stable with an H60


----------



## CallsignVega

Quote:


> Originally Posted by *TA4K*
> 
> lol yes. 1.5v is fun, especially when getting a 4.16ghz 45nm pentuim Dual Core stable with an H60


Nice.

Well I broke down and have four of these inbound:



I can't help it, EK water blocks draw me in like crack to a crack-whore!









I am going to hold off on the Rampage IV Extreme motherboard VRM and chip-set liquid blocks and the four RAM blocks until I am sure that is the route I want to take for my permanent setup.


----------



## Alatar

I'd make sure you keep some hefty airflow over the VRM area with the RIVE then. Personally my chip started throttling at around 5GHz without a fan blowing straight to the mosfets. Using around 1.5v.

Chips are power hungry and the small space for the power delivery between the ram slots isn't helping. But as long as you have active cooling or water on it it should be fine.

Also the EK mobo blocks for the RIVE are quite nice, liking mine









Really nice build you've got here.









I wanna order 5 680s too .__.


----------



## lannshine

Hi Vega

What is the monitor stand you are using? Did you buy a stock one or made your own?


----------



## PoopaScoopa

Quote:


> Originally Posted by *lannshine*
> 
> Hi Vega
> What is the monitor stand you are using? Did you buy a stock one or made your own?


See his previous post: http://www.overclock.net/t/1220962/vegas-heavyweight-display-and-computer-edition-2012/100_100#post_16724933
Quote:


> Originally Posted by *CallsignVega*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haha those aren't just ordinary bricks. They are topped with a special NASA developed synthetic Acrylonitrile Butadiene Carboxy Monomer specially developed to secure in-place and withstand the intense pressure associated with the FW900.


----------



## nvidiaftw12

Still vote for the xpower2


----------



## CallsignVega

Quote:


> Originally Posted by *Alatar*
> 
> I'd make sure you keep some hefty airflow over the VRM area with the RIVE then. Personally my chip started throttling at around 5GHz without a fan blowing straight to the mosfets. Using around 1.5v.
> Chips are power hungry and the small space for the power delivery between the ram slots isn't helping. But as long as you have active cooling or water on it it should be fine.
> Also the EK mobo blocks for the RIVE are quite nice, liking mine
> 
> 
> 
> 
> 
> 
> 
> 
> Really nice build you've got here.
> 
> 
> 
> 
> 
> 
> 
> 
> I wanna order 5 680s too .__.


Ya, I will have some fans blowing down on the board and the overclock won't be super high until everything goes under water.
Quote:


> Originally Posted by *nvidiaftw12*
> 
> Still vote for the xpower2


No doubt a good board, but the RIVE is about as good as X79 gets from what I've read, and I am still a bit iffy on MSI considering the problems I've had on my current Z68 one.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, I will have some fans blowing down on the board and the overclock won't be super high until everything goes under water.
> No doubt a good board, but the RIVE is about as good as X79 get's from what I've read and I am still a bit iffy on MSI considering the problems I've had on my current Z68 one.


Well from what I've seen it clocks the highest, and is ok with lots of volts.


----------



## Op125

The xpower II looks promising. The commercial alone makes me wanna upgrade to x79









http://www.youtube.com/watch?v=Es3jV7dS74Y

And EK have announced that they will make mosfet waterblocks for the xpower II;
http://www.ekwaterblocks.com/index.php?mact=News,cntnt01,detail,0&cntnt01articleid=134&cntnt01returnid=17

The Rampage IV is a safe bet I guess, but I find it to be a little undramatic for my taste.


----------



## Nocturin

Quote:


> Originally Posted by *Op125*
> 
> The xpower II looks promising. The commercial alone makes me wanna upgrade to x79
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.youtube.com/watch?v=Es3jV7dS74Y
> And EK have announced that they will make mosfet waterblocks for the xpower II;
> http://www.ekwaterblocks.com/index.php?mact=News,cntnt01,detail,0&cntnt01articleid=134&cntnt01returnid=17
> But the Rampage IV is a safe bet I guess, but I find it to be a little undramatic for my taste.


You had me going until I saw the chipset cooler. Ewwwwww.

Police: His computer is rigged to self destruct, cut the power now!
Owner: It's a heatsink!!!!!!!
Police: Mr. Hacker, you have rigged your computer to explode in the event of us appearing
Owner: I'm a freaking gamer?!!?!
Police: Meet Mr. Taser bad computer man.


----------



## armartins

Vega (and the other engineering minds lurking here) please give me your thoughts about what is possible doing to solve my debezel problem. Take a look please: http://www.overclock.net/t/1236177/video-need-urgent-help-with-dell-u2412m-debezel-mod#post_16840846


----------



## Op125

Quote:


> Originally Posted by *Nocturin*
> 
> You had me going until I saw the chipset cooler. Ewwwwww.
> Police: His computer is rigged to self destruct, cut the power now!
> Owner: It's a heatsink!!!!!!!
> Police: Mr. Hacker, you have rigged your computer to explode in the event of us appearing
> Owner: I'm a freaking gamer?!!?!
> Police: Meet Mr. Taser bad computer man.


Well, about the heatsink I really agree with you. If I were to buy this board I would demand a full-cover block to replace the guns.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Well from what I've seen it clocks the highest, and is ok with lots of volts.


Quote:


> Originally Posted by *Op125*
> 
> The xpower II looks promising. The commercial alone makes me wanna upgrade to x79
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.youtube.com/watch?v=Es3jV7dS74Y
> And EK have announced that they will make mosfet waterblocks for the xpower II;
> http://www.ekwaterblocks.com/index.php?mact=News,cntnt01,detail,0&cntnt01articleid=134&cntnt01returnid=17
> But the Rampage IV is a safe bet I guess, but I find it to be a little undramatic for my taste.


I won't be doing any 6GHz LN2 benching or anything like that, so a RIVE on my chilled water should do 5.2GHz or so without breaking a sweat. I also went with the RIVE as it has a full MB water block, versus just a MOSFET block for the X-Power II. I wonder why EK decided not to do a chipset block. Maybe they like the bullets.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> *Maybe they like the bullets*.










They look terrible.


----------



## Nocturin

Quote:


> Originally Posted by *nvidiaftw12*
> 
> 
> 
> 
> 
> 
> 
> 
> They look terrible.


Agree most copiously.


----------



## eskamobob1

Quote:


> Originally Posted by *CallsignVega*
> 
> I won't be doing any 6GHz LN2 benching or anything like that, so a RIVE on my chilled water should do 5.2GHz or so without breaking a sweat. I also went with the RIVE as it has a full MB water-block, versus X-Power II just MOSFET. I wonder why EK decided not to do a chipset block. Maybe they like the bullets.


I have no idea... I am so disappointed about that







... I was going orange, purple, and black, and that board would have fit my color scheme amazingly well (along with all-copper EK blocks







) and it has a lot more convinent features then the RIVE TBH
/rant

this looks like its coming along amazingly well... idk what your doing half the time, but i like it


----------



## Ghooble

So do you not use a case?


----------



## 636cc of fury

what kind of phase change setup are you going with Callsign?


----------



## CallsignVega

Quote:


> Originally Posted by *636cc of fury*
> 
> what kind of phase change setup are you going with Callsign?


My previous setup that has been in storage for a few months, phase change to chilled liquid (not direct contact phase change):





































This time without all of the ectoplasm on everything.









What do you guys think of a completely air-tight, sealed, clear container housing the computer with a crap-load of molecular desiccant in it? That would drop the humidity below 1%. If there is no humidity, there is nothing to condense. I'm not sure if humidity that low would negatively affect the components' normal operation; i.e., would rubber liquid seals shrink, etc.?

Supposedly molecular sieve has an awesome affinity for water, and it's not expensive either:

http://www.dkhardware.com/product-26020-msd5-molecular-sieve-adsorbent-5-pounds.html

http://en.wikipedia.org/wiki/Molecular_sieve

Quote:


> Originally Posted by *Ghooble*
> 
> So do you not use a case?


Nope, no case. I hang out with my wang out. (when's the last time you heard that phrase)


----------



## Ghooble

Quote:


> Originally Posted by *CallsignVega*
> 
> Nope, no case. I hang out with my wang out. (when's the last time you heard that phrase)


It's been a long time man lol


----------



## dtfgator

Quote:


> Originally Posted by *CallsignVega*
> 
> What do you guys think of a completely air-tight sealed clear container housing the computer with a crap load of molecular desiccant in it? That would drop the humidty below 1%. If there is no humidity, there is nothing to condense. Not sure if that low of humidity would negatively effect the components normal operation. IE, would rubber liquid seals, shrink etc?
> Nope, no case. I hang out with my wang out. (when's the last time you heard that phrase)


Hmm.. What if you just sealed it up, made sure all the connections were good, put some sinks with thermal adhesive on all the FET's and such, and then submerged the thing in mineral oil? Obviously you would have to put a sheet of something on the surface to prevent water from condensing on the tubes and dripping into the oil, but otherwise it should work fairly well!

You could also use 3M Novec if you want to shell out for a more expensive fluid. I hear that stuff is great for cooling because of its low boiling point (almost immediate phase change at operational temps), and it can be recycled just by being in a closed container and allowed to condense on the top and drip back in.


----------



## nvidiaftw12

Quote:


> Originally Posted by *dtfgator*
> 
> Hmm.. What if you just sealed it up, made sure all the connections were good, put some sinks with thermal adhesive on all the FET's and such, and then submerged the thing in mineral oil? Obviously you would have to put a sheet of something on the surface to prevent water from condensing on the tubes and dripping into the oil, but otherwise it should work fairly well!
> You could also use 3M Novec if you want to shell out for a more expensive fluid. I hear that stuff is great for cooling because of its low boiling point (almost immediate phase change at operational temps), and it can be recycled just by being in a closed container and allowed to condense on the top and drip back in.


He should be able to afford novec just fine.


----------



## CallsignVega

Quote:


> Originally Posted by *dtfgator*
> 
> Hmm.. What if you just sealed it up, made sure all the connections were good, put some sinks with thermal adhesive on all the FET's and such, and then submerged the thing in mineral oil? Obviously you would have to put a sheet of something on the surface to prevent water from condensing on the tubes and dripping into the oil, but otherwise it should work fairly well!
> You could also use 3M Novec if you want to shell out for a more expensive fluid. I hear that stuff is great for cooling because of its low boiling point (almost immediate phase change at operational temps), and it can be recycled just by being in a closed container and allowed to condense on the top and drip back in.


Quote:


> Originally Posted by *nvidiaftw12*
> 
> He should be able to afford novec just fine.


I have given serious thought to submersion condensation protection. Mineral oil has its problems, and Novec is $500 a gallon. It would take 6-8 gallons to properly submerge the setup, so we are talking $3,000-$4,000. I am crazy, but not that crazy.









Could you imagine springing a leak on that tank, LOL. Any fluid submersion also has issues with maintenance and with cleaning components when they are end-of-life/sold.

I am looking more into the clear sealed container idea with near-zero humidity using molecular sieve. That would let the components go without any insulation protection at all and still look good, would require little maintenance, and would also be very cheap. Instead of creating a barrier to keep the humidity from reaching the components, why not just remove the humidity from the equation?
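For a sense of scale, here's a rough sketch of how much water vapor a sealed enclosure would actually trap; the 200 L volume, 50% starting RH, saturation density, and ~20% sieve capacity are all assumed ballpark figures, not measurements:

```python
# Rough estimate of the water vapor inside a sealed case, and how much
# molecular sieve would be needed to soak it up.

def water_vapor_grams(volume_liters, relative_humidity, sat_density_g_per_m3=23.0):
    """Water vapor mass in a sealed volume (saturation density ~23 g/m^3 at 25 C)."""
    volume_m3 = volume_liters / 1000.0
    return volume_m3 * sat_density_g_per_m3 * relative_humidity

# Say the clear enclosure holds ~200 L of air at 50% RH when sealed.
vapor_g = water_vapor_grams(200, 0.50)

# 3A/4A molecular sieve typically adsorbs roughly 20% of its own weight in water.
sieve_needed_g = vapor_g / 0.20

print(f"trapped water vapor: {vapor_g:.1f} g")
print(f"sieve needed (min):  {sieve_needed_g:.1f} g")
```

Even with generous assumptions this lands at a few grams of water, so a small fraction of that 5 lb bag of sieve should bottom out the humidity and keep it there.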


----------



## Canis-X

Good idea! When, and more importantly how, are you going to do it?


----------



## nvidiaftw12

You would need to replace all the air in the box with dry air, which would be hard. Also, you would need a perfect silicone seal.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> You will need to replace all the air in the box with no humidity air. That would be hard. Also, you would need a perfect silicon seal.


The plan would be to have regular air in the case, then right before it gets sealed up air-tight, open a large amount of the very powerful desiccant in the corners of the "case" and then seal it. The pressure will stay generally the same as the outside atmosphere, and the fluctuations in pressure shouldn't be great enough to cause an air leak/humidity exchange. If I need to open the case to work on anything, I would just replace the desiccant with fresh stock and re-seal it.


----------



## dtfgator

Quote:


> Originally Posted by *CallsignVega*
> 
> The plan would be to have regular air in the case, then right before it gets sealed up air-tight, open a large amount of the very powerful desiccant in the corners of the "case" and then seal it. The pressure will stay generally the same as the outside atmosphere, and the fluctuations in pressure shouldn't be great enough to cause an air leak/humidity exchange. If I need to open the case to work on anything, I would just replace the desiccant with fresh stock and re-seal it.


You could also cut a hole in the bottom and top of the case with quick disconnect hoses on both, and then fill up the thing with something heavier than air from the bottom, like CO2 (Or Sulfur Hexafluoride) forcing all the air out of the top.. Then disconnect the valve on the top and bottom, and you should have a pure CO2 environment with no humidity (theoretically you pushed it all out). You could also toss a desiccant in there to help the process.

Edit: In the process, I would fill the thing from the bottom (with a dense gas) and have the top quick disconnect attached to nothing so it leaks. Then put a match up to the top tube and let it run. When the CO2 begins to come out of the tube, it will extinguish the match, and then you just need to let it feed for a little while longer. Disconnect the top valve first (so it seals) and then the bottom after you have built a little pressure. As long as the thing is completely airtight, you are good to go! A perfectly dry environment with nothing to condense!
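As a sanity check on the gas-purge idea, the ideal gas law gives a feel for how far a tank of CO2 goes; the 20 oz (567 g) paintball-tank fill and the 200 L enclosure volume below are assumed figures for illustration:

```python
# How many case volumes of CO2 gas come out of one small tank? Ideal-gas sketch.

R = 0.08206  # L*atm/(mol*K)

def gas_liters(mass_g, molar_mass_g, temp_c=25.0, pressure_atm=1.0):
    """Volume of gas released from a given mass, via PV = nRT."""
    mol = mass_g / molar_mass_g
    return mol * R * (temp_c + 273.15) / pressure_atm

co2_liters = gas_liters(567.0, 44.01)  # 20 oz ~= 567 g of CO2
case_liters = 200.0                    # assumed enclosure volume
print(f"{co2_liters:.0f} L of gas -> {co2_liters / case_liters:.1f} case volumes")
```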


----------



## Nocturin

Calcium chloride is cheap and would work very well too.


----------



## rdrdrdrd

Quote:


> Originally Posted by *dtfgator*
> 
> You could also cut a hole in the bottom and top of the case with quick disconnect hoses on both, and then fill up the thing with something heavier than air from the bottom, like CO2 (Or Sulfur Hexafluoride) forcing all the air out of the top.. Then disconnect the valve on the top and bottom, and you should have a pure CO2 environment with no humidity (theoretically you pushed it all out). You could also toss a desiccant in there to help the process.


This sounds freaking awesome. Maybe use argon or another noble gas, or have a helium 'bubble' inside an inverted aquarium; maybe make it gasometer-style.


----------



## CallsignVega

Ya, good ideas guys.


----------



## CallsignVega

Working on the ambient side of the loop today. Setting up to test the 3960X and RIVE under water.










Mocking up components to test routing of liquid lines. 1/2" ID - 3/4" OD Norprene tubing is hard to bend and fit in tight places.










The Team Group 2400 9-11-11-28 RAM came in (4x 4GB).










Working on some of the supply/return valve systems.










Made a custom stand out of some old Ikea speaker stands.










Reservoir up top with a couple silver kill coils.










Liquid line routing. The open ended valves that are shut will attach to the Geo-thermal section of the cooling loop.










Testing out the loop and checking for leaks.










Getting rid of air in the system has been a huge PITA. I am going to have to come up with some sort of custom pump/system to force water through the loop and flush all the air out under pressure. The Iwaki RD-30 is a beast of a pump in a closed system, but if there is air in the lines it has a hard time getting going. The system already used a full gallon of distilled water and I ran out, so I wasn't able to fire the rig up. Tomorrow is another day.


----------



## PoopaScoopa

I hope the silver and the aluminum fins don't give you any troubles. That is one amazing setup.


----------



## zdude

man i wish i had your money.


----------



## someonewhy

Will it run mario?


----------



## dtfgator

Quote:


> Originally Posted by *someonewhy*
> 
> Will it run mario?


Of course not, you need at least 4 680's for that.


----------



## TheBadBull

Quote:


> Originally Posted by *dtfgator*
> 
> Quote:
> 
> 
> 
> Originally Posted by *someonewhy*
> 
> Will it run mario?
> 
> 
> 
> Of course not, you need at least 4 680's for that.
Click to expand...

*cough*that'swhathe'sgot.*cough*


----------



## zdude

Quote:


> Originally Posted by *TheBadBull*
> 
> *cough*that'swhathe'sgot.*cough*


*cough*nohehasfive680's*cough*


----------



## TheBadBull

Quote:


> Originally Posted by *zdude*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheBadBull*
> 
> *cough*that'swhathe'sgot.*cough*
> 
> 
> 
> *cough*nohehasfive680's*cough*
Click to expand...

*cough*yes,_at least._*cough*


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> I hope the silver and the aluminum fins don't give you any troubles. That is one amazing setup.


I am thinking about removing the aluminum fins from the phase-change coolant tank, as they aren't doing much; the copper tubes should handle the heat transfer just fine on their own. Then there would be no aluminum in the loop. I have some time to work on that, though, as the phase-change part of the loop isn't connected yet. I have next week off from work, so I'd like to make some serious progress on the Geo-thermal side of the loop, which interests me the most. Due to its cool 11-13C water and virtually zero energy use, it will be the best part of the cooling aspect of this project, IMO.
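The appeal of 11-13C ground water is easy to put numbers on with a simple sensible-heat estimate (P = flow x specific heat x temperature rise); the 4 L/min flow and 10 C allowable rise below are assumptions, not measured values:

```python
# How much heat can the geothermal loop carry away?

def heat_watts(flow_lpm, delta_t_c, cp=4186.0, density=1000.0):
    """Heat carried by water: P = m_dot * cp * dT, with flow given in L/min."""
    m_dot = flow_lpm / 60.0 * density / 1000.0  # mass flow in kg/s
    return m_dot * cp * delta_t_c

# 11-13 C ground water warming to ~23 C gives roughly a 10 C rise to work with.
print(f"{heat_watts(4.0, 10.0):.0f} W at 4 L/min")
```

On those assumptions the loop can dump well over two kilowatts without the water leaving "cool" territory, which is why the near-zero energy cost is so attractive.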


----------



## CallsignVega

I am thinking of re-arranging the cooling loop so that:

Branch #1 = CPU > X79 > VRM/MOSFET

Branch #2 = 680 > 680 > 2x RAM Sticks

Branch #3 = 680 > 680 > 2x RAM Sticks

I think the balance between those would be fairly close. The VRM/MOSFET coolers are really low restriction and would pretty much balance out the resistance of the 2x RAM blocks. So essentially it's one CPU block versus two GPU blocks to balance resistance. Anyone think the resistance balance would be way off in the above configuration? (It doesn't need to be perfect.)
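Since all three branches share one pump, each sees the same pressure drop; with a turbulent dP ~ R*Q^2 model, the flow divides as 1/sqrt(R). A quick sketch, where the relative branch resistances are made-up numbers rather than measured values:

```python
# Flow split between parallel branches sharing one pump: equal pressure drop
# across branches, so with dP ~ R*Q^2 each branch's flow scales as 1/sqrt(R).

from math import sqrt

def flow_fractions(resistances):
    """Fraction of total flow through each parallel branch."""
    weights = [1.0 / sqrt(r) for r in resistances]
    total = sum(weights)
    return [w / total for w in weights]

# Branch 1: CPU + X79 + VRM; Branches 2/3: two GPUs + RAM each (relative guesses).
fractions = flow_fractions([1.0, 1.3, 1.3])
print([f"{f:.2f}" for f in fractions])
```

With those guesses the CPU branch takes roughly 36% of the total flow and each GPU branch roughly 32%, i.e. close enough that nothing should be starved.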


----------



## zdude

why not use a pump for each branch and not worry about resistance?


----------



## rdrdrdrd

Quote:


> Originally Posted by *CallsignVega*
> 
> Getting rid of air in the system has been a huge PITA. I am going to have to come up with some sort of custom pump/system to force water through the loop and flush all the air out under pressure. The Iwaki RD-30 is a beast of a pump in a closed system, but if there is air in the lines it has a hard time getting going. The system already used a full gallon of distilled water and I ran out, so I wasn't able to fire the rig up. Tomorrow is another day.


Have you tried putting one end into a tank of water and sucking the water in through the tubes? Seems like it would be better than trying to force the air out with the water.


----------



## CallsignVega

Quote:


> Originally Posted by *zdude*
> 
> why not use a pump for each branch and not worry about resistance?


Multiple pumps = more heat dumped into the loop and more chances of failure. The RD-30 is capable of 37 feet of head pressure, so I don't even know why I am worrying about resistance. It is so powerful I actually run it at 18 V instead of its normal 24 V. That might change, though, when I hook up all the Geo-thermal stuff.
Quote:


> Originally Posted by *rdrdrdrd*
> 
> Have you tried putting one end into a tank of water and sucking the water in through the tubes? Seems like it would be better than trying to force the air out with the water.


Ya, I think I may put in a valve just before the pump so I can hook up a reservoir and it can pump the water through directly and push the air out.
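For reference, the RD-30's rated 37 feet of head converts to static pressure with P = rho * g * h:

```python
# Pump head expressed as static pressure.

def head_to_kpa(head_ft, rho=998.0, g=9.81):
    """P = rho * g * h, with head given in feet and water near room temperature."""
    return rho * g * (head_ft * 0.3048) / 1000.0

kpa = head_to_kpa(37.0)
print(f"{kpa:.0f} kPa ({kpa * 0.14504:.1f} psi)")
```

About 16 psi of shut-off pressure, which is why a few blocks' worth of restriction barely registers on this pump.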


----------



## rdfloyd

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, I think I may put in a valve just before the pump so I can hook up a reservoir and it can pump the water through directly and push the air out.


Why not just get a shop vac to pull water through, so that most of the air is gone? Stick one end in a 5-gallon bucket of water, then hook the vacuum up to the other end.


----------



## Nocturin

Wait. How did you lose water?

That's not a good sign...?

Is the pump having issues staying primed?

Would a vacuum pump work?


----------



## Aaron_Henderson

Pardon my asking, but where did you purchase the fresnel lens you used with the Sony CRTs? I've not read through the entire thread, so I apologize if I have missed it.


----------



## PoopaScoopa

Quote:


> Originally Posted by *rdfloyd*
> 
> Why not just get a shop vac to pull water through, so that most of the air is gone? Stick one end in a 5-gallon bucket of water, then hook the vacuum up to the other end.


Wouldn't that pollute the water?


----------



## CallsignVega

Quote:


> Originally Posted by *rdfloyd*
> 
> Why not just get a shop vac to pull water through, so that most of the air is gone? Stick one end in a 5-gallon bucket of water, then hook the vacuum up to the other end.


Quote:


> Originally Posted by *Nocturin*
> 
> Wait. How did you lose water?
> That's not a good sign...?
> Is the pump having issues staying primed?
> Would a vacuum pump work?


No worries, no lost water and the system is purged. 3960X idling at 18C, sweet.








Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Pardon my asking, but where did you purchase the fresnel lens you used with the Sony CRTs? I've not read through the entire thread, so I apologize if I have missed it.


Here ya go:

http://www.amazon.com/Kantek-Monitor-Magnifier-Widescreen-MAG19WL/dp/B002JGKI6O/ref=sr_1_1?ie=UTF8&qid=1333336315&sr=8-1


----------



## Lovidore

How do you go about switching the valve between each cooling system and how do you decide when to use what?


----------



## IrishV8

Holy hot glue Batman


----------



## Nocturin

Good to hear everything working out alright









*waits on benches*


----------



## AMD_Freak

Quote:


> Originally Posted by *zdude*
> 
> man i wish i had your money.


he don't have any money can't you tell........................

he's done spent it all


----------



## CallsignVega

Got my RAID 0 set up (boy, that was a nightmare on X79) and Win 7 installed. Games are downloading. Got three GTX 680's now. This is why I love nVidia:










Even something as complicated as running three CRT's in portrait is a snap. Install the driver, hit configure Surround displays, bam: organize the screens and you're done. Even these early drivers work really well. Thankfully nVidia allows each card to use its own RAMDAC for each FW900, something AMD cannot do.

Setup is kinda a mess while I install and test stuff:










The 3960X at stock settings under maximum Intel Burn Test only reaches a max temp of 41 C on the cores using the ambient radiator. I used Thermaltake Chill Factor III this time around and it appears to be doing quite well.


----------



## stu.

It's so beautiful...


----------



## wongwarren

Do not dedicate PhysX to CPU. It sucks.


----------



## Volkovy87

holy ninja baseball batman.


----------



## CallsignVega

Expect PCI-E 3.0 vs 2.0 tests, tests of what is required to reach 2GB of VRAM and what happens when that limit is reached, etc. So far in BF3 the results are pretty bad news once memory usage reaches 2048MB! (Although it takes quite a bit to surpass 2GB of VRAM, even at extremely high resolution and settings.) More to follow...


----------



## PoopaScoopa

Kill Windows Aero and watch it free up 500MB of VRAM and sometimes 900MB RAM. "net stop uxsms"


----------



## Smo

Quote:


> Originally Posted by *PoopaScoopa*
> 
> Kill Windows Aero and watch it free up 500MB of VRAM and sometimes 900MB RAM. "net stop uxsms"


Blimey O_O.


----------



## axipher

Quote:


> Originally Posted by *CallsignVega*
> 
> Setup is kinda a mess while I install and test stuff:


Looking awesome man, and what exact tubing is that? Where did you get it from? I've been looking for some good quality neoprene tubing but its availability in Canada is a little limited.


----------



## PCModderMike

Quote:


> Originally Posted by *wongwarren*
> 
> Do not dedicate PhysX to CPU. It sucks.


He didn't, dedicate to PhysX is not selected.


----------



## $ilent

What's all that on the motherboard? And how's the 3770K, Vega?


----------



## Lost4468

What's the point of the Fresnel lens?


----------



## PatrickCrowely

Devastating Rig.... Will be keeping up with Your progress....


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> Kill Windows Aero and watch it free up 500MB of VRAM and sometimes 900MB RAM. "net stop uxsms"


Ya, I will include tests with Aero on and with it off to compare how it affects the VRAM limit.
Quote:


> Originally Posted by *axipher*
> 
> Looking awesome man, and what exact tubing is that? Where did you get it from, I've been looking for some good quality neoprene tubing but it's availability in Canada is a little limited.


It is 1/2" ID / 3/4" OD Norprene A-60-G. It's a little pricey but it's pretty much the best tubing for water loops. Very chemical resistant, flexible and very low permeability.
http://www.professionalplastics.com/NORPRENEA-60-G

You should be able to find someone that ships to Canada with Google.
Quote:


> Originally Posted by *$ilent*
> 
> whats all that on the motherboard? and hows the 3770k vegas?


You must mean my UD9 X58 board. That was ecto-plasm for condensation protection.







Actually it was Dragon Skin liquid rubber applied over insulation foam in key areas.
Quote:


> Originally Posted by *Lost4468*
> 
> What's the point of the Fresnel lens?


Fresnel lenses let the images of the CRT's converge to a near-seamless gap between the displays. This gives a large, near-perfect image when seated in the proper viewing position. They also have the benefit of making each screen appear slightly larger (magnification) and add a unique depth effect.

I love this new nVidia Surround. It keeps the desktop task-bar only on the center monitor, and when I maximize windows they only maximize on the center screen. Awesome features! With the simple registry edit, I've got all of the cards running at PCI-E 3.0. In the BIOS I can switch between 1.0/2.0/3.0 at will, so this will make for some nice tests.

Who wants to bet there will be appreciable differences on my setup?


----------



## axipher

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *axipher*
> 
> Looking awesome man, and what exact tubing is that? Where did you get it from, I've been looking for some good quality neoprene tubing but it's availability in Canada is a little limited.
> 
> 
> 
> It is 1/2" ID / 3/4" OD Norprene A-60-G. It's a little pricey but it's pretty much the best tubing for water loops. Very chemical resistant, flexible and very low permeability.
> http://www.professionalplastics.com/NORPRENEA-60-G
> 
> You should be able to find someone that ships to Canada with Google.
Click to expand...

Thanks for the info, I've been looking for some good black tubing, still need to find some white tubing for my other build and some red anti-kink.


----------



## gibsy

Hi Vega, I think this is quite off topic, but I'm planning to buy 3x 120Hz monitors and I don't really know what GPU to get. Can 2x GTX 680 2GB in SLI do the job? Thanks!


----------



## Nocturin

Short answer: yes.

Long answer: if you're talking about 16xAA and ultra everything, you may need a third one. Two will be plenty for high settings and 4-8xAA.


----------



## CallsignVega

Quote:


> Originally Posted by *gibsy*
> 
> Hi Vega, I think this is quite off topic, but I'm planning to buy 3x 120Hz monitors and I don't really know what GPU to get. Can 2x GTX 680 2GB in SLI do the job? Thanks!


Ya, I would say if you want to keep FPS up close to 120 to match the refresh rate, 3x 680's would be a safer bet. But if you turn down some settings you can "scrape" by on 2x 680's.


----------



## PCModderMike

Quote:


> Originally Posted by *gibsy*
> 
> Hi Vega, I think this is quite off topic, but I'm planning to buy 3x 120Hz monitors and I don't really know what GPU to get. Can 2x GTX 680 2GB in SLI do the job? Thanks!


I would worry more about the memory buffer than the actual power of the cards, 2 or 3 680's, either way you still only have 2GB of memory. I use as much as 1900MB of memory during BF3 gameplay on large maps, as reported by afterburner, just on a single 1920x1080 monitor. I can only imagine how much more memory is used once you start getting into triple monitor resolutions. Might start running into memory swap issues. I really thought Nvidia was going to match AMD's practice of selling their cards with more and more memory. Now again, just like with the 580, we have to wait for a double the memory version of the 680.
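A back-of-envelope look at how resolution alone scales the color-buffer footprint; this only counts render targets (textures and geometry dominate real usage), and the triple-buffer/4-bytes-per-pixel assumptions are illustrative:

```python
# Rough color-buffer footprint at single vs. surround resolutions. Treat this
# as the *extra* cost of resolution, not total VRAM usage.

def framebuffer_mb(width, height, msaa=1, buffers=3, bytes_per_px=4):
    """Color-buffer memory in MB, with the back buffers scaled by MSAA samples."""
    return width * height * bytes_per_px * buffers * msaa / (1024 ** 2)

print(f"1080p,   4xMSAA: {framebuffer_mb(1920, 1080, msaa=4):.0f} MB")
print(f"surround 4xMSAA: {framebuffer_mb(3600, 1920, msaa=4):.0f} MB")
```

Going from 1080p to a 3600x1920 surround roughly triples the render-target cost, which eats noticeably into a fixed 2GB buffer even before textures.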


----------



## PoopaScoopa

Quote:


> Originally Posted by *PCModderMike*
> 
> I would worry more about the memory buffer than the actual power of the cards, 2 or 3 680's, either way you still only have 2GB of memory. I use as much as 1900MB of memory during BF3 gameplay on large maps, as reported by afterburner, just on a single 1920x1080 monitor. I can only imagine how much more memory is used once you start getting into triple monitor resolutions. Might start running into memory swap issues. I really thought Nvidia was going to match AMD's practice of selling their cards with more and more memory. Now again, just like with the 580, we have to wait for a double the memory version of the 680.


Kill Aero. BF3 does not use 1.9GB for 1080P, even with 4xMSAA.


----------



## PCModderMike

Quote:


> Originally Posted by *PoopaScoopa*
> 
> Kill Aero. BF3 does not use 1.9GB for 1080P, even with 4xMSAA.


Good point. That's the highest spike I've seen though, 1900MB of usage is not the norm. How much do you think Aero eats up?


----------



## PoopaScoopa

Quote:


> Originally Posted by *PCModderMike*
> 
> Good point. That's the highest spike I've seen though, 1900MB of usage is not the norm. How much do you think Aero eats up?


I've seen it use 500MB before. http://www.overclock.net/t/1220962/vegas-heavyweight-display-and-computer-edition-2012/300_100#post_16879187 2GB with 2XMSAA is fine for a single 1600P monitor in BF3.


----------



## mtbiker033

the heaveyweight BAUSS!!!

one of, if not the coolest set up I have ever seen. can't wait to see more pics!


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> ...snip/With the simple registry edit, I've got all of the cards running at PCI-E 3.0. In the BIOS I can switch between 1.0/2.0/3.0 at will so this will make for some nice tests.
> Who want's to bet there will be appreciable differences on my setup?


I'm willing to bet there will be less than a 10% difference between 2.0/3.0 using x8 and above, and a 20-40% difference using x4 and below.

Yes, I am prepared for foot-in-mouth syndrome.


----------



## CallsignVega

Quote:


> Originally Posted by *Nocturin*
> 
> I'm willing to bet there will be less than a 10% difference between 2.0/3.0 using x8 and above, and a 20-40% difference using x4 and below.
> Yes, I am prepared for foot-in-mouth syndrome.


10% is huge!


----------



## Hydroplane

Vega your builds inspire me









btw I was the one who commented on your youtube video about pines of rome lol


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> 10% is huge!


Touché, forgot about your massive resolution/millions of pixels.


----------



## gibsy

So guys, should I wait for the 4GB version of the 680 or go with the 2GB version? I'm confused... By the way, thanks for the input, guys!


----------



## CallsignVega

Well, the results of my PCI Express 2.0 versus 3.0 on my 4-way SLI GTX 680 FW900 Surround setup are in. The results are so incredible I had to start the tests over from scratch and run them multiple times for confirmation!









Test setup:

3960X @ 5.0 GHz (temporarily at this lower clock)
Asus Rampage IV Extreme with PCI-E slots running 16x/8x/8x/8x
(4) EVGA GTX 680's running 1191 MHz core, 3402 MHz memory
nVidia driver 301.10 with the PCI-E 3.0 registry adjustment turned on and off for each applicable test
GPU-Z 0.6.0










After PCI-E settings changed, confirmed with GPU-Z:



















All settings in the nVidia control panel, in-game, in the benchmark, and in EVGA Precision are left UNTOUCHED between benchmark runs. The only setting adjusted is PCI-E 2.0 versus 3.0, switched back and forth for confirmation (with reboots, obviously, for the registry edit).










I kid you not, that is how much PCI-E 2.0 running at 16x/8x/8x/8x versus PCI-E 3.0 bottlenecks BF3 and Heaven 2.5 at these resolutions. I attribute this to the massive bandwidth being transferred over the PCI-E bus. We are talking 4-way SLI at up to 10 megapixels in alternate frame rendering. Entire frames at high FPS are being swapped, and PCI-E 2.0 falls on its face.

The interesting part was that while running PCI-E 2.0, GPU utilization dropped way down, as you would typically see when "CPU limited." In this instance I am neither CPU limited nor GPU limited. We are really at the point now where you can be PCI-E limited unless you go PCI-E 3.0 8x (equivalent to PCI-E 2.0 16x) or faster on *all GPU's* in the system. GPU utilization dropped into the ~50% range due to PCI-E 2.0 choking the cards to death. As soon as I enabled PCI-E 3.0, GPU utilization skyrocketed to 95+% on all cards. I was going to run more benchmarks and games, but the results are such blow-outs it seems pretty pointless to do any more. This may interest those running these new PCI-E 3.0 GPU's who think they are CPU limited (below 95% GPU utilization) yet might actually have PCI-E bandwidth issues.

Down to the nitty-gritty: if you run a single GPU, a single *16x* PCI-E 2.0 slot will be fine. When you start to run multiple GPU's and/or run these new cards at 8x speed, especially in Surround/Eyefinity, make sure to get PCI-E 3.0.
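A crude model of why AFR at surround resolutions can stress the bus: assume each finished frame crosses the PCI-E link uncompressed. This overstates real traffic (drivers don't necessarily ship whole frames this way, and textures/geometry add their own load), and the usable x8 rates below are approximations:

```python
# Back-of-envelope: frame traffic at surround resolution vs. PCI-E x8 bandwidth.

PCIE2_X8_GBS = 4.0  # ~4 GB/s usable on PCI-E 2.0 x8 (approximate)
PCIE3_X8_GBS = 7.9  # ~7.9 GB/s usable on PCI-E 3.0 x8 (approximate)

def frame_traffic_gbs(width, height, fps, bytes_per_px=4):
    """Bandwidth needed if every finished frame crosses the bus uncompressed."""
    return width * height * bytes_per_px * fps / 1e9

traffic = frame_traffic_gbs(3600, 1920, 97)  # surround res at the 3.0 result
print(f"{traffic:.2f} GB/s of frame data "
      f"({traffic / PCIE2_X8_GBS:.0%} of 2.0 x8, {traffic / PCIE3_X8_GBS:.0%} of 3.0 x8)")
```

Even this simplified number eats a big slice of a 2.0 x8 link before any other traffic, which is consistent with the cards sitting around 50% utilization on 2.0.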


----------



## Acefire

Sorry, you have 4x GTX 680's hardcore bottlenecked by 2GB of VRAM.


----------



## CallsignVega

Quote:


> Originally Posted by *Acefire*
> 
> Sorry, you have 4x GTX 680's hardcore bottlenecked by 2GB of VRAM.


Ya, because simply changing PCI-E slots between 2.0 and 3.0 speeds will affect VRAM usage. Please tell me you are just trolling and not actually that stupid.


----------



## PoopaScoopa

Wow. Now someone needs to send you 4 7970s to see if x8 3.0 is even fast enough to make up for the 900MB/s CF bridge and everything running off the first card.


----------



## armartins

Vega, since you have a RIVE, could you do another simple test so I can put my mind at ease? I'm really interested in whether, at your resolution of 3600x1920 (close to my 3840x1920), you see any difference between PCI-E 3.0 and 2.0 at 16x with a single card. Your tests make it very clear when multi-GPU is involved, but since you mix multi-GPU and surround resolution, I can't really dismiss the possibility that a single card still runs into a "worth upgrading" bottleneck. I guess you'll only need to flip the 3 micro-switches, and in 2 reboots you can answer my question =)


----------



## GoldenTiger

Vega... nice tests!

I'm starting to wonder if my 2-way SLI is bottlenecked by having x8 2.0 for each card then... it looks possible (2560x1600). My GPU utilization is high (90%+) at almost all times in games though. Regardless, looking like a day-1 buy for Ivy Bridge for me to enable PCI-E 3.0 on my Z68 Gen3 board.

Is there any chance I could trouble you for a 2-way SLI run at 3600x1920 (PCI-E 2.0 vs 3.0)? Pretty please? That's most comparable to my resolution... though still far greater (4mp vs 7mp).


----------



## armartins

I'll make another post so this won't pass unnoticed in an edit. Just comparing the BF3 results at both resolutions, the role of resolution in the bottleneck is pretty evident, since we go from 69 to 97 FPS. On the other hand, I find 133 to 140 FPS a rather small increase considering the decrease in final resolution. What are your thoughts about it, Vega? Is it possible that we are still topping the VRAM at 3600x1920, so memory swapping keeps us from breaking the 140 FPS wall? Also, please tell us the game settings you're playing on and the VRAM usage at 3600x1920! Those results are really making me rethink my plan to keep my non-PCI-E-3.0 Z68 board until the Z77 boards are priced right 2-3 months after release.


----------



## Ken1649

Quote:


> Originally Posted by *armartins*
> 
> I'll make another post so this won't pass unnoticed in an edit. Just comparing the BF3 results at both resolutions, the role of resolution in the bottleneck is pretty evident, since we go from 69 to 97 FPS. On the other hand, I find 133 to 140 FPS a rather small increase considering the decrease in final resolution. What are your thoughts about it, Vega? Is it possible that we are still topping the VRAM at 3600x1920, so memory swapping keeps us from breaking the 140 FPS wall? Also, please tell us the game settings you're playing on and the VRAM usage at 3600x1920! Those results are really making me rethink my plan to keep my non-PCI-E-3.0 Z68 board until the Z77 boards are priced right 2-3 months after release.


I'm thinking about the same thing; the jump in Heaven 3.0 from 94.8 to 140.3 is quite a disparity.


----------



## drbaltazar

Did you know water can be cooled below 0 C? Down to -8 or -10 C, if I recall. It's been almost 10 years since I read about this, and I can't recall how they did it. Do they make water chillers for computers, or are those too small? Normally a chiller has one pair of lines bringing the warm water in and another pair returning it; instead of a fan directly cooling the loop, they used a second water circuit to cool the primary one going to the CPU, GPU, etc.


----------



## Arni90

Quote:


> Originally Posted by *armartins*
> 
> I'll make another post so this won't go unnoticed in an edit. Just comparing the BF3 results at the two resolutions, the role of resolution in the bottleneck is pretty evident, since we go from 69 to 97 FPS. On the other hand, I find 133 to 140 FPS a rather small increase considering the drop in final resolution. What are your thoughts about it, Vega? Is it possible we are still topping out the VRAM at 3600x1920, so memory swapping keeps us from breaking the 140 FPS wall? Also, please tell us the game settings you're playing on and the VRAM usage at 3600x1920! Those results are really making me rethink my plan of keeping my non-PCI-E 3.0 Z68 board until the Z77 boards are priced right 2-3 months after release.


Two possible reasons:
CPU bottleneck
PCIe 3.0 x16/x8/x8/x8 still isn't enough.

I'm leaning towards a CPU bottleneck here, BF3 is incredibly CPU-hungry.


----------



## coolhandluke41

Quote:


> Originally Posted by *CallsignVega*
> 
> Well, the results of my PCI Express 2.0 versus 3.0 on my 4-way SLI GTX 680 FW900 Surround setup are in. The results are so incredible I had to start the tests over from scratch and run them multiple times for confirmation!
> 
> 
> 
> 
> 
> 
> 
> 
> Test setup:
> 3960X @ 5.0 GHz (temp slow speed)
> Asus Rampage IV Extreme with PCI-E slots running 16x/8x/8x/8x
> (4) EVGA GTX 680's running 1191MHz core, 3402 MHz Memory
> nVidia Driver 301.10 with PCI-E 3.0 registry adjustment turned on and off for each applicable test
> GPU-Z 0.6.0
> 
> 
> 
> 
> 
> 
> 
> 
> After PCI-E settings changed, confirmed with GPU-Z:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All settings in nVidia control panel, in-game and in benchmark, EVGA precision are all UNTOUCHED between benchmark runs. The only setting adjusted is the PCI-E 2.0 to 3.0 and back and forth for confirmation (Reboots obviously for registry edit).
> 
> 
> 
> 
> 
> 
> 
> 
> I kid you not, that is how much PCI-E 2.0 running at 16x/8x/8x/8x versus PCI-E 3.0 bottlenecks BF3 and Heaven 2.5 at these resolutions. I attribute this to the massive bandwidth being transferred over the PCI-E bus. We are talking 4-way SLI at up to 10-megapixels in alternate frame rendering. Entire frames at high FPS are being swapped and PCI-E 2.0 falls on its face.
> The interesting part was that while running PCI-E 2.0, the GPU utilization dropped way down as would be typically seen if you are "CPU" limited. In this instance I am not CPU limited, nor GPU limited. We are really at a point now that you can be PCI-E limited unless you go PCI-E 3.0 8x (16x PCI-E 2.0) or faster on *all GPU*'s in the system. GPU utilization dropped down into the ~50% range due to PCI-E 2.0 choking them to death. As soon as I enabled PCI-E 3.0, the GPU utilization skyrocketed to 95+% on all cores. I was going to run more benchmarks and games but the results are such blow-outs it seems pretty pointless to do any more. It may interest some of those out there running these new PCI-E 3.0 GPU's in which they think they are CPU limited (below 95% GPU utilization) yet might have PCI-E bandwidth issues.
> Down to the nitty gritty; if you run a single GPU, yes; a single *16x* speed PCI-E 2.0 slot will be fine. When you start to run multiple GPU's and/or run these new cards at 8x speed, especially in Surround/Eyefinity, make sure to get PCI-E 3.0.


Thanks for doing this Vega, have to bookmark this for reference







Any chance of an SLI comparison => x8/x8 vs x16/x16? (trying to decide between UD5 and G1 Sniper 3)


----------



## CallsignVega

Quote:


> Originally Posted by *PoopaScoopa*
> 
> Wow. Now someone needs to send you 4 7970s to see if x8 3.0 is even fast enough to make up for the 900MB/s CF bridge and everything running off the first card.


I know I wouldn't want to run 4x 7970's at my resolution on 2.0.








Quote:


> Originally Posted by *armartins*
> 
> VEGA, since you have a RIVE could you do another simple test so I can put my mind at ease? I'm really interested if, in your resolution of 3600x1920 (close to mine 3840x1920 ) you see any difference from a single VGA in PCI-E 3.0 to 2.0 at 16X. Your tests make it very clear when multi-GPU is involved, but since you MIX multi-GPU and surround resolution I can't really dismiss the possibility of still running in a "worth upgrading" bottleneck with a single card. I guess you'll only need to flip the 3 micro-switches and in 2 reboots you can answer my question =)


Quote:


> Originally Posted by *GoldenTiger*
> 
> Vega... nice tests!
> I'm starting to wonder if my 2-way SLI is bottlenecked by having x8 2.0 for each card then... it looks possible (2560x1600). My GPU utilization is high (90%+) at almost all times in games though. Regardless, looking like a day-1 buy for Ivy Bridge for me to enable PCI-E 3.0 on my Z68 Gen3 board.
> 
> Is there any chance I could trouble you for a 2-way SLI run at 3600x1920 (PCI-E 2.0 vs 3.0)? Pretty please? That's most comparable to my resolution... though still far greater (4mp vs 7mp).


Quote:


> Originally Posted by *coolhandluke41*
> 
> Thanks for doing this Vega ,have to bookmark this for reference
> 
> 
> 
> 
> 
> 
> 
> ,any chance on SLI comparison => x8x8 vs x16x16 ? (trying to decide between UD5 and G1 Sniper 3)


To the three quotes above: sorry, a minimum of three GTX 680's is required to run my three FW900's in analog due to RAMDAC connection limitations (only one DVI-I per card).
Quote:


> Originally Posted by *armartins*
> 
> I'll make another post so this won't go unnoticed in an edit. Just comparing the BF3 results at the two resolutions, the role of resolution in the bottleneck is pretty evident, since we go from 69 to 97 FPS. On the other hand, I find 133 to 140 FPS a rather small increase considering the drop in final resolution. What are your thoughts about it, Vega? Is it possible we are still topping out the VRAM at 3600x1920, so memory swapping keeps us from breaking the 140 FPS wall? Also, please tell us the game settings you're playing on and the VRAM usage at 3600x1920! Those results are really making me rethink my plan of keeping my non-PCI-E 3.0 Z68 board until the Z77 boards are priced right 2-3 months after release.


Quote:


> Originally Posted by *Arni90*
> 
> Two possible reasons:
> CPU bottleneck
> PCIe 3.0 x16/x8/x8/x8 still isn't enough.
> I'm leaning towards a CPU bottleneck here, BF3 is incredibly CPU-hungry.


Yes, BF3 ran into a CPU bottleneck going from the higher resolution to the lower one; that is why the gap between resolutions is smaller. The 5 GHz 3960X could feed BF3 at the highest resolution well enough that the GPU's were at maximum utilization at 133 FPS, yet when the demand was lowered with a lower resolution, the FPS only went up to 140 (CPU cap) and the GPU utilization dropped a bit. Granted, a 140 FPS CPU cap isn't too terrible.







Plus I have at least a few hundred MHz left to go on this 3960X!
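The CPU-vs-bus distinction is easier to see with a rough number. A back-of-envelope sketch (Python; the 4 bytes/pixel figure and the naive "every frame crosses the bus once" model are my assumptions, not anything Vega measured):

```python
# Estimate SLI compositing traffic over PCIe in 4-way AFR at Vega's
# Surround resolution. Assumed model (not from the thread): each
# finished frame is shipped uncompressed to the display card.
WIDTH, HEIGHT = 3600, 1920
BYTES_PER_PIXEL = 4      # RGBA, assumed
FPS = 133                # Vega's BF3 result at this resolution

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
traffic_gbps = frame_bytes * FPS / 1e9   # GB/s arriving at the master card

pcie2_x8 = 4.0   # GB/s per direction, PCIe 2.0 x8
pcie3_x8 = 7.9   # GB/s per direction, PCIe 3.0 x8 (128b/130b encoding)

print(f"per-frame buffer: {frame_bytes / 1e6:.1f} MB")
print(f"compositing traffic at {FPS} FPS: {traffic_gbps:.2f} GB/s")
print(f"fraction of PCIe 2.0 x8: {traffic_gbps / pcie2_x8:.0%}")
print(f"fraction of PCIe 3.0 x8: {traffic_gbps / pcie3_x8:.0%}")
```

Even this crude model puts the frame traffic alone at roughly 90% of a PCIe 2.0 x8 link, before textures, geometry, and driver overhead, which fits the GPU-utilization collapse described earlier in the thread.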


----------



## Mhill2029

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, because simply changing PCI-E slots between 2.0 and 3.0 speeds will affect VRAM usage. Please tell me you are just trolling and not actually that stupid.


I think he's actually that stupid... lol

Those PCI-E 2.0 vs PCI-E 3.0 differences seem a little large to be normal lol

From what I've seen on most test sites there is little to no difference, but what you've shown is staggering! Good to know, since I'm going tri/quad 680s myself once I can decide on Surround monitors. I just can't decide.....


----------



## CallsignVega

Quote:


> Originally Posted by *Mhill2029*
> 
> I think he's actually that stupid... lol
> Those PCI-E 2.0 vs PCI-E 3.0 comparisons seem a little large in difference to be normal lol
> From what i've seen on most test sites there is little to no difference, but what you've shown is staggering! Good to know since, i'm going TRI/Quad 680's myself when i can decide on Surround Monitors. I just can't decide.....


I know, was floored when I saw the results too! Just remember though, most tests in the past have been with 1-2 cards max on a single monitor (and even in that limited scenario going from 8x to 16x PCI-E 2.0 gave a 2-5% increase). I have yet to see any PCI-E 2.0 vs 3.0 tests come anywhere near the demands of the PCI-E bus that my setup uses.


----------



## csm725

Quite astonishing results, thanks for taking the time to do this scientifically!


----------



## nvidiaftw12

That's incredible! I would have never thought it would be _that_ huge.


----------



## armartins

Totally forgot about your need for 3xDVIDL... and analogue (right?) support.


----------



## CallsignVega

Hitting the VRAM limit, and PCI-E 2.0 vs 3.0 battle video tests. Make sure to watch in 480P rather than lower (I forgot to set the camera back to 720P and I don't feel like recording everything all over again lol). Sorry about the video quality, but it is still viewable.

http://www.youtube.com/watch?v=S0-xcxAvu54&context=C4435d1aADvjVQa1PpcFOpwHlB70YlOHjBhzZ8mtcWAKgkFdz3LBo=

http://www.youtube.com/watch?v=tkZzssm-kWs&context=C41467ecADvjVQa1PpcFOpwHlB70YlOALsTk3pZxTxt4tbGt5H9Yk=


----------



## gibsy

Quote:


> Originally Posted by *CallsignVega*
> 
> Hitting VRAM limit and PCI-E 2.0 vs 3.0 battle video tests. Make sure to watch in 480P instead of lower (I forgot to set the camera back to 720P and I don't feel like recording everything all over again lol). Sorry about the video quality but it is still view-able.
> http://www.youtube.com/watch?v=S0-xcxAvu54&context=C4435d1aADvjVQa1PpcFOpwHlB70YlOHjBhzZ8mtcWAKgkFdz3LBo=
> http://www.youtube.com/watch?v=tkZzssm-kWs&context=C41467ecADvjVQa1PpcFOpwHlB70YlOALsTk3pZxTxt4tbGt5H9Yk=


Nice video, Vega! But how did you change something in the registry that switches the link speed to 3.0? Can you teach me?


----------



## FcZenitFan

Vega,

Thanks for the test! Looks like I have to get that switch going. Could you point me to what the registry value needs to be adjusted?

Thanks!

Edit: Nevermind, found it. Thanks anyway!


----------



## Methodical

Hey Vega, I just checked out the videos and I have one question. If you had not made the changes to save the 300 MB of VRAM, how would that have affected the FPS and smoothness of the game as it hit the VRAM limit? In other words, what issues, if any, did you experience, or would you have experienced, hitting the VRAM cap with 3.0 enabled? On the flip side, would saving that VRAM have rescued the game's performance while 2.0 was enabled?

Also, can you post your BF3 settings or pm them to me. I noticed you had some settings low, such as texture decorations or something like that. I ask because I am not familiar with all those different settings and I just maxed everything with the 2x 680s. I don't think my settings hurt me much since I am only on a single 1920x1200 monitor though, but curious about your settings.

Can you post what settings you changed to get back those 300mbs of vram? I could not catch it or see what you did on the video.

Btw, nice job on visually showing the effects to back up the graphs so that we can see exactly what you mean. Visual is always better, at least for me.

Thanks...Al
Quote:


> Originally Posted by *CallsignVega*
> 
> Hitting VRAM limit and PCI-E 2.0 vs 3.0 battle video tests. Make sure to watch in 480P instead of lower (I forgot to set the camera back to 720P and I don't feel like recording everything all over again lol). Sorry about the video quality but it is still view-able.
> 
> http://www.youtube.com/watch?v=S0-xcxAvu54&context=C4435d1aADvjVQa1PpcFOpwHlB70YlOHjBhzZ8mtcWAKgkFdz3LBo=
> 
> http://www.youtube.com/watch?v=tkZzssm-kWs&context=C41467ecADvjVQa1PpcFOpwHlB70YlOALsTk3pZxTxt4tbGt5H9Yk=


----------



## Aaron_Henderson

Way to go Vega, lots of useful stuff in this thread. Thanks for keeping this informative, rather than full of opinion, like so many others do.


----------



## trivium nate

subscribed


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> I know, was floored when I saw the results too! Just remember though, most tests in the past have been with 1-2 cards max on a single monitor (and even in that limited scenario going from 8x to 16x PCI-E 2.0 gave a 2-5% increase). I have yet to see any PCI-E 2.0 vs 3.0 tests come anywhere near the demands of the PCI-E bus that my setup uses.


*puts foot in mouth*

Wow. I am astonished.

little "mini" reviews like yours are awesome. thanks again







. You finally got your smooth gameplay, after what 5 different tries? So is this monitor set-up staying for awhile or what ideas do you have for the future?

I wonder, what FPS can you get on one maxed monitor?


----------



## TheBadBull

Quote:


> Originally Posted by *Nocturin*
> 
> *puts foot in mouth*
> 
> Wow. I am astonished.


Pics or it didn't happen.


----------



## 161029

Subbed.


----------



## CallsignVega

Quote:


> Originally Posted by *Methodical*
> 
> Hey Vega, I just checked out the videos and I have one question. If you had not made the changes to save the 300 MB of VRAM, how would that have affected the FPS and smoothness of the game as it hit the VRAM limit? In other words, what issues, if any, did you experience, or would you have experienced, hitting the VRAM cap with 3.0 enabled? On the flip side, would saving that VRAM have rescued the game's performance while 2.0 was enabled?
> Also, can you post your BF3 settings or pm them to me. I noticed you had some settings low, such as texture decorations or something like that. I ask because I am not familiar with all those different settings and I just maxed everything with the 2x 680s. I don't think my settings hurt me much since I am only on a single 1920x1200 monitor though, but curious about your settings.
> Can you post what settings you changed to get back those 300mbs of vram? I could not catch it or see what you did on the video.
> Btw, nice job on visually showing the effects to back up the graphs so that we can see exactly what you mean. Visual is always better, at least for me.
> Thanks...Al


All you do is right-click on your games .exe file and check "disable visual themes and desktop composition" and hit OK. Win Aero tries to reserve desktop graphical items in VRAM and it kills your max allowable VRAM usage for your games. I saw 300-400MB which is a LOT when you are multi-display gaming close to the VRAM limit.

If I hit the VRAM limit with PCI-E 3.0 the results would be the same: drastic slowdowns, and the game would still be unplayable just like in the video. I was hundreds of MB under the VRAM limit in the PCI-E 2.0 tests, so hitting the VRAM limit is not the cause of the incredible slowdown. If that were the case, you would see the slowdown on the PCI-E 3.0 test as well, since everything is identical besides bus speed.

Everything in BF3 is on Ultra besides the following: Decoration - low, as that just adds additional junk to the world and makes enemies harder to see. Effects - low, since in seriously huge firefights with explosions all over the place I want the least slowdown and the least smoke in my way. MSAA - off, as FXAA does a great job and MSAA takes a performance hit for barely any return with FXAA already on. Motion blur - off, of course; I don't know why anyone in their right mind would want it on. And HBAO is always on - I just turned it off in the video to demonstrate the VRAM differences at the highest resolution and VRAM usage with Aero disabled, etc.
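The "disable visual themes and desktop composition" checkbox can also be scripted. A hedged sketch that just builds the equivalent `reg add` command; the AppCompat key path and layer tokens below are my recollection of the Windows 7 names, not something confirmed in this thread, so verify them on your own machine (e.g. by ticking the box manually once and inspecting the registry):

```python
# Build the registry command for the per-exe compatibility checkbox.
# Key path and flag tokens are assumptions (Windows 7 AppCompat layers);
# they are NOT taken from the thread - verify before relying on them.
def aero_off_reg_command(exe_path: str) -> str:
    key = r"HKCU\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"
    flags = "DISABLETHEMES DISABLEDWM"  # themes off + desktop composition off
    return f'reg add "{key}" /v "{exe_path}" /t REG_SZ /d "{flags}" /f'

print(aero_off_reg_command(r"C:\Games\bf3.exe"))
```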
Quote:


> Originally Posted by *Nocturin*
> 
> *puts foot in mouth*
> Wow. I am astonished.
> little "mini" reviews like yours are awesome. thanks again
> 
> 
> 
> 
> 
> 
> 
> . You finally got your smooth gameplay, after what 5 different tries? So is this monitor set-up staying for awhile or what ideas do you have for the future?
> I wonder, what FPS can you get on one maxed monitor?


Ya, the CRT's are so great for gaming that the only things able to replace them are OLED or Sony's Crystal LED direct-view screens. So in short, I might have this setup for a while!









On another note, I posted my Surround technical problem in another thread if anyone has a similar problem:

http://www.overclock.net/t/1240348/gtx-680-surround-mode-problem-help-technical-support


----------



## Methodical

Vega, thanks for the response and info - much obliged.


----------



## psyside

Amazing...


----------



## Sylon

You sir, just motivated me for school lol. Maybe one day...


----------



## SLK

Nice vids Vega. I have a question for you: if I were to purchase another 680 to run SLI with a 2560x1600 monitor, would I hit the PCI-E 2.0 bandwidth limit?


----------



## CallsignVega

Quote:


> Originally Posted by *SLK*
> 
> Nice vids Vega. I have a question for you, if I were to purchase another 680 to do SLI with a monitor that is 2560x1600. Would I hit the PCI-e 2.0 bandwidth limit?


You already have a gen 3 board, I would just toss a 3770K in there and you will be good. You would maybe see 5-10% increase at the most on a single monitor.


----------



## CallsignVega

Nice, glad to see someone else backing up my tests. Although the difference isn't as dramatic as mine due to the lower resolution and card count, these are still very impressive gains.


















From user: psikeiro.


----------



## Kortwa

Very sizable increases in fps. And as always a pleasure to read through your logs Vega







Keep up the awesome work.


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> Nice, glad to see someone else backing up my tests. Although the difference isn't as dramatic as mine due to a lower resolutions and card count, they are still very impressive differences.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From user: psikeiro.


linkY?

I can't expand that photo or zoom in without re-sizing my browser zoom







.


----------



## axipher

Quote:


> Originally Posted by *Nocturin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> Nice, glad to see someone else backing up my tests. Although the difference isn't as dramatic as mine due to a lower resolutions and card count, they are still very impressive differences.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From user: psikeiro.
> 
> 
> 
> linkY?
> 
> I can't expand that photo or zoom in without re-sizing my browser zoom
> 
> 
> 
> 
> 
> 
> 
> .

Open the image in a new tab and it will be full screen:

http://cdn.overclock.net/1/10/10b499c8_DIFFERENCE.jpeg


----------



## PoopaScoopa

Quote:


> Originally Posted by *CallsignVega*
> 
> Nice, glad to see someone else backing up my tests. Although the difference isn't as dramatic as mine due to a lower resolutions and card count, they are still very impressive differences.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From user: psikeiro.


From 32 to 53 minimum fps on triple 1080p. Were all three monitors plugged into the first card by any chance? Seems a little high for 8GB/s vs 16GB/s unless it was transferring all frame buffer from card two to card one. He's still going to have two monitors on one card even if he splits it up I guess.

It's too bad we can't test three lanes in x16/x16/x16 3.0 to compare with the current x16/x8/x16 3.0 to see if x8 3.0 really is enough as long as each card has its own RAMDAC. AMD eyefinity is just looking worse and worse running everything off one card + the CF bridge bottleneck.


----------



## Methos07

Wait Vega is going to keep a setup for a while? boring









You're supposed to change your entire computer on a monthly basis for my morning entertainment.


----------



## 161029

You are difficult to please. How about a woman?


----------



## armartins

By earlier posts, women are changed on a monthly basis also lol... and if he finds "the woman" we'll probably see far fewer epic builds from him lol.

Poopa, could you shed some light on where I could read more about the bottleneck effect while running all monitors (as required) from a single card in Eyefinity + CrossFire? I'm really excited for my major update, scheduled within the next six months: 40nm->28nm on the GPU and 32nm->22nm on the CPU. On the GPU side I'm still in doubt which route to go. Since 2GB looks like a no-go from a future-proofing point of view, and I'll already be at 28nm, I could just add another 3GB 7970 down the road (eager to read more about this possible bottleneck); on the other hand, in the games I play Nvidia always seems to be ahead in performance.

Man, how I wish there were a reasonably priced 680 (or whatever they call it) with 3x DP for no tearing, the ability to disable that "boost" piece of ****, and a healthy 3GB+ framebuffer. If it were a full GK110 I would pay $750 for such a SINGLE GPU card in a heartbeat. That would mean total awesomeness in Blizzard games with no tearing in 3x portrait (games where Nvidia historically has better results), and decent enough performance for high settings in BF3 @ 60Hz... dreams, dreams... Honestly, the 680 as-is looks more and more like a mid-range card sold as high end while the mack daddy GK110 isn't home yet.


----------



## PoopaScoopa

Quote:


> Originally Posted by *armartins*
> 
> By earlier posts woman are changed on a monthly basis also lol... and If he finds "the woman" probably we will see much less epic builds from his part lol. Poopa could you shed some light were I could read more about the bottleneck effect while running all monitors (as needed) from a single card in eyefinitity + crossfire. I'm really excited for my major uptadate that's scheduled from now to the next six months going from 40nm->28nm on GPU and from 32nm->22nm on CPU. On the GPU area I'm still in doubt which route to go. As it's looking like 2gb are a no go for a future proof PoV since I'll already be at 28nm I can just add another 3gb 7970 down the road (eager to read more about this possible bottleneck), on the other side the games I play Nvidia seems to always be ahead in performance. Man how I wish there was a reasonable priced 680 or whatever name they call with 3 DP, for no tearing, ability to disable this "boost" piece of **** and a healthy 3gb+ framebuffer, if it was a full GK110 I would pay $750 for such a SINGLE GPU card in a heartbeat. This would result in total awesomeness in Blizzard games with no tearing in 3x portrait (games in which Nvidia historically has better results). And would have decent enough performance for some high settings in BF3 @60hz... dreams, dreams... honestly 680 as is more and more looking like a mid card sold as high end while the mack daddy GK110 isn't home yet.


The CF bridge is limited to 900MB/s and is serial, whereas the SLI bridge is duplex and lets each card talk to the others directly, without having to pass through another card to get there (meaningful for 3/4-way). With AMD you have to run everything off the first card, so if you're using 3-way, or even worse 4-way, card two's bridge to card one is carrying the load of cards three and four as well as its own. Letting each monitor have its own card reduces frame-buffer transfers over the PCIe lanes and the bridge.

Even in 2-way, the CF bridge can only transmit in one direction at a time, which adds to micro-stutter. It's not just a bandwidth issue: the serial arrangement, where card one controls the other cards and gives them instructions while those cards have to wait, or go via the PCIe lanes, to communicate back, causes delays in frame times.

There'll be 4GB non-ref models that will allow much better overclocking in another month or two. Not sure which ones will have 3 DP connectors. Supposedly the 685 will be out in Sept.
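The bridge point above is easy to put in numbers. A rough sketch (Python; 4 bytes/pixel and whole-frame transfers over the bridge are my assumptions, and real CrossFire traffic management differs):

```python
# Compare naive AFR frame-shipping at triple-1080p against the ~900 MB/s
# serial CrossFire bridge discussed above.
WIDTH, HEIGHT = 5760, 1080   # 3x 1080p landscape
BYTES_PER_PIXEL = 4          # assumed RGBA frame buffer
CF_BRIDGE = 900.0            # MB/s, serial, per the post above

def bridge_load_mbps(fps: float) -> float:
    """MB/s that naive whole-frame transfers would need at a given FPS."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * fps / 1e6

for fps in (32, 53):
    load = bridge_load_mbps(fps)
    print(f"{fps} FPS -> {load:.0f} MB/s ({load / CF_BRIDGE:.0%} of the CF bridge)")
```

At the 53 FPS minimum from psikeiro's numbers, even this naive model already exceeds the bridge, which is exactly when traffic has to spill onto the PCIe lanes.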


----------



## armartins

One of the main reasons I upgraded my 3x 2209WAs to 3x U2412Ms was the presence of DPs on the latter. Yes, they also have DL-DVI, but the problem is that I'll stay on a single card for a while, and after my experience with my 5970 in terms of stuttering (and a kind of tearing that showed up in some situations) I moved to a single 6970 DCII along with the new monitors, with no regrets. WoW would only recognize my Eyefinity resolution in windowed fullscreen mode (Blizzard's problem), and in that mode only a single card works anyway with ATI (another reason to go Nvidia for future-proof performance down the road).

My main concern is not to mix DVI with DP, and that's not going to happen in a non-SLI Nvidia setup. And I'm definitely not confident about spending $1000+ on two 680s at the moment for 3 DVI outputs, even with 4GB; I can foresee a repeat of what happened to 280 prices when the 285s came out, as soon as GK110 hits the shelves. I was also burned before at the 8800GT (G92) launch.


----------



## zdude

update? or some other ridiculously expensive thing?


----------



## CallsignVega

I forgot how much work it is to get these large water setups done properly.









Assembly line done.










X79 chip-set, RAM and MOSFET/VRM blocks installed.










Setting up GPU's.



















One of the GPU's temperature is a lot higher than the other three. It is quite strange as I assembled all blocks exactly the same, with the same TIM and quantity. So far the good ones are idling at 25-27C and under load around high 30's C. Not looking forward to draining and re-doing card #2. Hopefully I don't have a bad block.
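A single hot card in an otherwise uniform loop usually means a mounting or contact problem rather than a bad block or loop. A trivial sketch of the kind of sanity check that flags it (the temperatures here are made up for illustration, not the readings above):

```python
# Flag GPUs running well above the median of their siblings; a large
# delta with identical blocks and TIM usually points at seating/contact.
def flag_hot_cards(temps_c, max_delta=8.0):
    baseline = sorted(temps_c)[len(temps_c) // 2]  # median-ish baseline
    return [i for i, t in enumerate(temps_c) if t - baseline > max_delta]

print(flag_hot_cards([26.0, 41.0, 27.0, 25.0]))  # -> [1]
```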


----------



## PatrickCrowely

Quote:


> Originally Posted by *CallsignVega*
> 
> I forgot how much work it is to get these large water setups done properly.
> 
> 
> 
> 
> 
> 
> 
> 
> Assembly line done.
> 
> 
> 
> 
> 
> 
> 
> 
> X79 chip-set, RAM and MOSFET/VRM blocks installed.
> 
> 
> 
> 
> 
> 
> 
> 
> Setting up GPU's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One of the GPU's temperature is a lot higher than the other three. It is quite strange as I assembled all blocks exactly the same, with the same TIM and quantity. So far the good ones are idling at 25-27C and under load around high 30's C. Not looking forward to draining and re-doing card #2. Hopefully I don't have a bad block.


Great setup Man, nice Youtube channel (subbed). Very curious to see what this setup will do on water. What are the name/model of those Ram Blocks?
EDIT:
Well I got to 3rd video & You answered my questions about the Ram. WOW!


----------



## ispano

Instant subscribe after seeing those pics above.


----------



## Elloquin

Quote:


> Originally Posted by *CallsignVega*
> 
> I forgot how much work it is to get these large water setups done properly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One of the GPU's temperature is a lot higher than the other three. It is quite strange as I assembled all blocks exactly the same, with the same TIM and quantity. So far the good ones are idling at 25-27C and under load around high 30's C. Not looking forward to draining and re-doing card #2. Hopefully I don't have a bad block.


Before you get too drastic depending on card manufacturer check to see that all 4 cards are using the same bios. I had 3 x EVGA gtx 285's at one point and one was running the "super clocked" bios even tho they were all basic vanilla reference cards.


----------



## Malcolm

I need to visit this section more often. Subbed.


----------



## CallsignVega

Quote:


> Originally Posted by *Elloquin*
> 
> Before you get too drastic depending on card manufacturer check to see that all 4 cards are using the same bios. I had 3 x EVGA gtx 285's at one point and one was running the "super clocked" bios even tho they were all basic vanilla reference cards.


It was a stand-off that wasn't fully seated from the factory on the high-temp GPU. Corrected, and now all GPU's idle at 21-25 C and sit at 33-38 C under full overclocked load. All buttoned up and ready to roar:


----------



## Mhill2029

Regarding your memory, have you encountered any issues using high-frequency memory? I was considering the Team Xtreem 2600MHz set, but I was under the impression most 3930K/3960X's have trouble due to excessive IMC strain.


----------



## CallsignVega

Quote:


> Originally Posted by *Mhill2029*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regarding your memory, have you encounterd any issue with using high freq memory? I was considering the Team Xtreem 2600Mhz set, but i was under the impression most 3930K/60X's have trouble due to excessive IMC strain.


Honestly I haven't had time to push them yet. Currently running 2500 MHz at 9-11-11-28 and they are doing fine.









I will post when I find their limits.
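For reference, a quick sketch of what that strap is worth on paper (Python; the 64-bit-per-channel figure is standard DDR3, not thread data):

```python
# Theoretical peak bandwidth of quad-channel DDR3 at 2500 MT/s.
MT_PER_S = 2500          # effective transfer rate (DDR3-2500)
BYTES_PER_TRANSFER = 8   # 64-bit channel width
CHANNELS = 4             # Sandy Bridge-E quad channel

peak_gbps = MT_PER_S * 1e6 * BYTES_PER_TRANSFER * CHANNELS / 1e9
print(f"theoretical peak: {peak_gbps:.0f} GB/s")  # -> theoretical peak: 80 GB/s
```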


----------



## Methos07

What game is possibly worthy of that setup? lol.


----------



## Nocturin

I believe what started the whole thing was flight simulators, but I only started to pay attention to Vega's builds when he made the custom "phase" chiller, so that's the earliest one I remember seeing.


----------



## CallsignVega

Quote:


> Originally Posted by *Methos07*
> 
> What game is possibly worthy of that setup? lol.


I pre-ordered the collectors editions of Guild Wars 2 and Diablo III. Add in a splash of BF3, Counter-Strike: GO and I still haven't played through Skyrim and Witcher 2 yet. Those titles should keep me pretty busy for the rest of the year.


----------



## Methos07

Quote:


> Originally Posted by *CallsignVega*
> 
> I pre-ordered the collectors editions of Guild Wars 2 and Diablo III. Add in a splash of BF3, Counter-Strike: GO and I still haven't played through Skyrim and Witcher 2 yet. Those titles should keep me pretty busy for the rest of the year.


I approve. Thanks for your service, btw.


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> I pre-ordered the collectors editions of Guild Wars 2 and Diablo III. Add in a splash of BF3, Counter-Strike: GO and I still haven't played through Skyrim and Witcher 2 yet. Those titles should keep me pretty busy for the rest of the year.


What's your BF3 name? A bunch of us from OCN play frequently and it would be awesome to get you to play with us. We have about 4-6 regulars now.


----------



## CallsignVega

Quote:


> Originally Posted by *Methos07*
> 
> I approve. Thanks for your service, btw.


You're welcome.
Quote:


> Originally Posted by *Nocturin*
> 
> whats your bf3 name? A bunch of us from OCN play frequently and it would be awesome to get you to play with us. We have about 4-6 regulars now.


Name is Zybaine. Ya, look me up. I've hardly played at all in the last couple of weeks due to this build, but should be good to go soon.


----------



## dr/owned

My head hurts from reading 43 pages in one sitting. (I'm planning on doing a similar build in August minus the geotherm cooling loop and crt's)

Questions:

What is the brand of stainless steel desk you're using? I want something really solid so I'm deciding between glass, metal (buy aluminum plate and build some legs), or hard epoxy resin (high school science tables...ridiculously strong).
I've heard that going 4x sli can actually decrease fps numbers and cause extra incompatibilities over 3x sli. Comment?
Have you gotten the water chiller condensation issue solved (desiccant vs CO2 vs ??) ? Rather than trying to remove humidity or insulate against condensation I built a power controller that will calculate the dew point on the fly and keep the temperature of the coolant a few degrees above this. I have no idea how overclocks are going to be affected with sub ambient water vs sub zero so I don't know if it's even worth it to run a chiller if it's not going sub zero. If I do go sub zero I'd like to hear what other people have done besides stuffing a ton of crap in the case and/or gumming it up with dielectric grease.
Why not 4x SSD's with an LSI raid controller?
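The dew-point tracking dr/owned describes can be sketched with the Magnus approximation (a common formula; the coefficients and the 3 °C margin here are illustrative assumptions, not details of his actual controller):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Dew point via the Magnus approximation (reasonable for roughly 0-60 C)."""
    b, c = 17.62, 243.12  # Magnus coefficients for water vapor over a flat surface
    gamma = math.log(rel_humidity_pct / 100.0) + (b * temp_c) / (c + temp_c)
    return c * gamma / (b - gamma)

def coolant_setpoint_c(temp_c: float, rel_humidity_pct: float,
                       margin_c: float = 3.0) -> float:
    """Keep the chiller setpoint a few degrees above the current dew point."""
    return dew_point_c(temp_c, rel_humidity_pct) + margin_c

# Example: 25 C room at 60% RH -> dew point ~16.7 C, so chill no lower than ~19.7 C
print(round(coolant_setpoint_c(25.0, 60.0), 1))
```

A controller built this way would re-read a temperature/humidity sensor on a loop and nudge the setpoint continuously, so the coolant tracks the room instead of a fixed temperature.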


----------



## Couch Potato

Sub'd


----------



## CallsignVega

Quote:


> Originally Posted by *dr/owned*
> 
> My head hurts from reading 43 pages in one sitting. (I'm planning on doing a similar build in August minus the geotherm cooling loop and crt's)
> Questions:
> 
> What is the brand of stainless steel desk you're using? I want something really solid so I'm deciding between glass, metal (buy aluminum plate and build some legs), or hard epoxy resin (high school science tables...ridiculously strong).
> I've heard that going 4x sli can actually decrease fps numbers and cause extra incompatibilities over 3x sli. Comment?
> Have you gotten the water chiller condensation issue solved (desiccant vs CO2 vs ??) ? Rather than trying to remove humidity or insulate against condensation I built a power controller that will calculate the dew point on the fly and keep the temperature of the coolant a few degrees above this. I have no idea how overclocks are going to be affected with sub ambient water vs sub zero so I don't know if it's even worth it to run a chiller if it's not going sub zero. If I do go sub zero I'd like to hear what other people have done besides stuffing a ton of crap in the case and/or gumming it up with dielectric grease.
> Why not 4x SSD's with an LSI raid controller?


1: It's a stainless steel industrial table top I bought from Advance Tabco. This thing doesn't even flinch with all the weight I have on it, and it's USA made, baby!

2: Maybe on a single screen, but I have had great success with 4-way SLI in Surround mode. I plan on doing GTX 680 SLI scaling tests sometime in the future, but here are my 3GB GTX 580's with a 990X:










3: Haven't gotten to that yet. That is a whole animal in and of itself! Proper condensation protection is quite difficult if you're going below the dew point. Are the small gains justifiable versus chilled water above the dew point? I'm not so sure anymore, especially since even the temps on my ambient loop are pretty darn low.

4: There is a point of overkill. Most say that RAID 0 on SSDs is way overkill versus a single SSD. Two SSDs load games extremely quickly in RAID 0, and I don't think I'd see much of a speed boost from adding two more. Then of course your RAID 0 array gets less reliable every time you add another drive.

On another note:

I've had success getting a custom resolution and frequency working in Surround! Running 3600x1877 @ 95 Hz. That works out to the correct FW900 aspect ratio for each screen, raises the refresh rate a nice bit above stock, and the whole display setup is almost a 2:1 width/height ratio, which is just about perfect.

I am running these settings right now and they seem to be doing pretty well:

3960X @ 5 GHz @ 1.47v.
105 MHz Base-clock for a little more PCI-E "oomph".
2240 MHz @ 9-11-11-28 on the 16GB RAM. May try and get those timings a bit lower in the future.

Idle @ 28 C and game load 45-55 C. Prime95 full load temps are ~57 C. I also purchased the Intel overclock warranty for $35, a small price I think for some peace of mind, and they cross-ship if you let them put a reserve amount on your credit card.
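Ignoring blanking intervals, a back-of-the-envelope check of what that 3600x1877 @ 95 Hz custom mode asks of the cards (a rough sketch; real modeline timings add blanking overhead on top of this):

```python
# Rough active-pixel throughput for the 3600x1877 @ 95 Hz Surround mode.
width, height, refresh_hz = 3600, 1877, 95

active_pixel_rate = width * height * refresh_hz  # pixels per second, no blanking
per_screen = active_pixel_rate / 3               # split across three portrait FW900s

print(f"total: {active_pixel_rate / 1e6:.0f} MHz, per screen: {per_screen / 1e6:.0f} MHz")
# -> total: 642 MHz, per screen: 214 MHz
```

Roughly 214 MHz of active pixels per CRT is comfortably inside the FW900's rated ~400 MHz bandwidth, which is consistent with the mode running stable.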


----------



## dr/owned

Quote:


> Originally Posted by *CallsignVega*
> 
> 3: Haven't gotten to that yet. That is a whole animal in and of itself! Proper condensation protection is quite difficult to do properly if going below the dew point. Are the small gains justifiable versus chilled water above the dew point? I am not so sure anymore. Especially here even on my ambient loop the temps are pretty darn low.


Thank you for the reply. If you decide on doing CO2 flooding then let me know. Some in-the-shower thinking is telling me it might be a good idea to do some sort of polycarb/acrylic bell jar system where you put the entire desktop inside of it (so you don't have to glue your pc case shut) and then install a CO2 sensor like this with a custom controller. A bit more precise than using a match to gauge fill level. I'm an electrical engineer so this stuff interests the hell out of me.


----------



## CallsignVega

Quote:


> Originally Posted by *dr/owned*
> 
> Thank you for the reply. If you decide on doing CO2 flooding then let me know. Some in-the-shower thinking is telling me it might be a good idea to do some sort of polycarb/acrylic bell jar system where you put the entire desktop inside of it (so you don't have to glue your pc case shut) and then install a CO2 sensor like this with a custom controller. A bit more precise than using a match to gauge fill level. I'm an electrical engineer so this stuff interests the hell out of me.


Cool, yeah, it will be interesting to bounce some ideas off of you. I don't want to do a conformal sealant again. Are you waiting for "big Kepler" for your build? I'd imagine when that hits I'll just swap out the four cards and be ready to go again.


----------



## dr/owned

Quote:


> Originally Posted by *CallsignVega*
> 
> Cool, yeah, it will be interesting to bounce some ideas off of you. I don't want to do a conformal sealant again. Are you waiting for "big Kepler" for your build? I'd imagine when that hits I'll just swap out the four cards and be ready to go again.


August just seems like the best time for a lot of things to shake themselves loose: GK110 or 4 GB 680's, Intel 530/720 series SSDs, the 3980X, less HDD price gouging. Plus by that time IB will be thoroughly benched, and enthusiast-level Z77 boards with quad-SLI support will abound should I go that way.

Plus in August I'll be able to take advantage of Intel employee pricing


----------



## JedixJarf

Quote:


> Originally Posted by *CallsignVega*
> 
> Cool, yeah, it will be interesting to bounce some ideas off of you. I don't want to do a conformal sealant again. Are you waiting for "big Kepler" for your build? I'd imagine when that hits I'll just swap out the four cards and be ready to go again.


Good, wouldn't want to wait too long to see you redo your build again


----------



## nvidiaftw12

Quote:


> Originally Posted by *dr/owned*
> 
> What is the brand of stainless steel desk you're using? I want something really solid so I'm deciding between glass, metal (buy aluminum plate and build some legs), or *hard epoxy resin (high school science tables...ridiculously strong)*.


My dad has one of those. I get it when I move into my new room.


----------



## Hydroplane

Quote:


> Originally Posted by *CallsignVega*
> 
> It was a stand-off that wasn't fully seated from factory on the high temp GPU. Corrected and now all GPU's idle 21-25 C and full overclock load 33-38 C.


Wow those temps are low considering the amount of stuff you're cooling with the Mo Ra 9x140. It seems to be doing a great job. What frequencies are you running the 680s at?


----------



## CallsignVega

On my 4-way GTX 680 setup using EVGA reference cards with EK water blocks, my stable overclock is 1202 MHz core and 3534 MHz memory. I pretty much spent all day testing those clocks using Heaven 3.0, BF3, Crysis 2, Metro 2033 and Skyrim; Skyrim with some mods and the HD texture pack would crash at the lowest frequency, so modded Skyrim set my ceiling. I found that performance scales linearly with increased memory clocks and there is no "sweet spot". Get that memory frequency as high as it can go!

Those numbers are still pretty good considering 4-way SLI is usually a bit harder to overclock highly. 4-way 680 just destroys Skyrim in Surround:

http://www.youtube.com/watch?v=FvShhfWpk2M&context=C41467ecADvjVQa1PpcFOpwHlB70YlOALsTk3pZxTxt4tbGt5H9Yk=

I am just astounded by how smoothly the GTX 680 handles Surround using PCI-E 3.0 slots. There is no micro-stuttering, pausing or jitter. I don't think I could tell the difference between a single GPU running a single monitor and this 4-way SLI Surround setup. Combine that with the CRTs and this is by far the smoothest and best Surround/Eyefinity setup of mine to date.









It will be interesting to see if any of these new EVGA non-reference cards will allow the voltage to rise above the 1.175 V software limit (I think more voltage, not more power phases, is really the only way to get more clocks out of these).


----------



## Cavi Mike

I love the neoprene hoses and real hose clamps. Strictly business.


----------



## stu.

Just watched that latest video, and I had to change my pants. Twice.


----------



## CallsignVega

Quote:


> Originally Posted by *Cavi Mike*
> 
> I love the neoprene hoses and real hose clamps. Strictly business.











Quote:


> Originally Posted by *stu.*
> 
> Just watched that latest video, and I had to change my pants. Twice.


lol

The great news is that with driver 301.25 I no longer have the Surround setup not working on cold boot-up problem. Everything is working great now in Surround mode.

I've noticed that Witcher 2 has a problem when I load up the game, instead of it filling the screen, it's just a narrow landscape type view into the world. Anyone familiar with Surround/Eyefinity in Witcher 2 know how to fix that?


----------



## Op125

Vega asked me if I could post some pictures when I got done, and I thought I might as well just share it with all of you. And since everyone in this thread seems to be high on eyefinity I thought you might wanna see this

What I've basically done is that I have copied Vega's 3x 120hz no bezel setup;

And I wanna give a big thanks to Vega for his help and all













@Vega
I gave up trying to mount the base intact. The shape and weight of the thing made taking it apart the easier option.


----------



## CallsignVega

Quote:


> Originally Posted by *Op125*
> 
> Vega asked me if I could post some pictures when I got done, and I thought I might as well just share it with all of you. And since everyone in this thread seems to be high on eyefinity I thought you might wanna see this
> What I've basically done is that I have copied Vega's 3x 120hz no bezel setup;
> And I wanna give a big thanks to Vega for his help and all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Vega
> I gave up trying to mount the base intact. The shape and weight of the thing made taking it apart the easier option.


Looks sweet! It's pretty awesome to game on, isn't it? That is my favorite LCD Surround/Eyefinity setup. It looks like a mirror image of mine.









Haha I told you that base would be a PITA to try and mount on the back without taking it apart.









You shimmed up the monitor mount clamps a bit to get everything lined up right? I had to do the same thing. Looks like you have the little fan pointed up at the PCBs too. Glad it worked out so well.


----------



## Op125

Quote:


> Originally Posted by *CallsignVega*
> 
> You shimmed up the monitor mount clamps a bit to get everything lined up right? I had to do the same thing.


Yeah, but since I'm quite new to multiple monitor setups it took me quite a while before I figured out why the monitors didn't line up correctly. But it worked out in the end


----------



## dr/owned

Quote:


> Originally Posted by *CallsignVega*
> 
> It will be interesting to see if any of these new EVGA non-reference cards will allow the voltage to rise above software 1.175 (I think really the only way to get more clocks out of these and just not more power phases).


According to TiN's work on boosting the volts on a 680, it's a pretty simple process of connecting an extra wire to the PWM controller.


----------



## CallsignVega

Quote:


> Originally Posted by *dr/owned*
> 
> According to TiN's work on boosting the volts on a 680 it's a pretty simple process of connecting an extra wire to the pwm controller.


Sounds interesting. Link?


----------



## dr/owned

Quote:


> Originally Posted by *CallsignVega*
> 
> Sounds interesting. Link?


Here ya go. He goes to the trouble of attaching a bunch of DIP switches to all 7 of the control pins so he can toggle them on the fly. Consulting the datasheet for the controller (image below), 1.175 V corresponds to hex 46 = binary 01000110: VID7 = 0, VID6 = 1, VID5 = 0, VID4 = 0, VID3 = 0, VID2 = 1, VID1 = 1. Connect VID2 and VID1 to VID3 (or find a spot on the PCB that is ground... I'd do this since it's easy to find ground on a PCB) and you end up with 01000000 = hex 40 = 1.2125 V. It'd be a fun experiment to do, especially if you already have an extra 680 lying around.
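The bit math dr/owned describes, tying VID2 and VID1 low to move hex 46 to hex 40, checks out as simple bit masking (the voltage values are as quoted from the datasheet he cites):

```python
# VID pins are bit positions in the controller's VID code.
VID1, VID2 = 1 << 1, 1 << 2   # the two pins being tied to ground

stock = 0x46                   # 01000110 -> 1.175 V per the quoted datasheet
modded = stock & ~(VID1 | VID2)  # forcing those pins low clears their bits

assert modded == 0x40          # 01000000 -> 1.2125 V per the quoted datasheet
print(f"{stock:08b} -> {modded:08b}")  # prints 01000110 -> 01000000
```

Lower VID codes map to higher voltages on this controller, which is why clearing bits raises the output.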


----------



## CallsignVega

Quote:


> Originally Posted by *dr/owned*
> 
> Here ya go. He goes to the trouble of attaching a bunch of DIP switches to all 7 of the control pins so he can toggle them on the fly. Consulting the datasheet for the controller (image below), 1.175 V corresponds to hex 46 = binary 01000110: VID7 = 0, VID6 = 1, VID5 = 0, VID4 = 0, VID3 = 0, VID2 = 1, VID1 = 1. Connect VID2 and VID1 to VID3 (or find a spot on the PCB that is ground... I'd do this since it's easy to find ground on a PCB) and you end up with 01000000 = hex 40 = 1.2125 V. It'd be a fun experiment to do, especially if you already have an extra 680 lying around.
> http://www.overclock.net/content/type/61/id/829920/width/583/height/608/flags/


Hmm, interesting. Although these four GTX 680's are chewing up games pretty easily at 1200 MHz, even at high-resolution Surround!


----------



## PatrickCrowely

Quote:


> Originally Posted by *Op125*
> 
> Vega asked me if I could post some pictures when I got done, and I thought I might as well just share it with all of you. And since everyone in this thread seems to be high on eyefinity I thought you might wanna see this
> What I've basically done is that I have copied Vega's 3x 120hz no bezel setup;
> And I wanna give a big thanks to Vega for his help and all


That is nice, great job. Took that engineering gene... --->


----------



## WolfssFang

I'm so confused about how the three monitors work with the Fresnel lenses on them.


----------



## XPC

Fresnel lenses basically magnify whatever is in front of them. The lenses have no bezels and are placed right next to one another at the correct angle and distance from the monitor screens, so when you sit at the proper viewing position, all three magnified images merge into one (relatively) seamless image.


----------



## Nocturin

Nice! I can't wait until I can have toys like you guys.


----------



## feteru

this is ridiculous. I do have a question though: do you know what the performance numbers would be like for a single 680 at 1440p @ 100 Hz? I am looking into the new Catleaps from the buy here for my upcoming build, and I was wondering what sort of frames I would get in Skyrim, BF3, and mayhaps Crysis with a single 2GB 680. AA would not be necessary, if you need to know. Thanks, and hope you sort out your condensation issues (with the phase change) so you can get this system really crunching.


----------



## CallsignVega

Quote:


> Originally Posted by *feteru*
> 
> this is ridiculous. I do have a question though: do you know what the performance numbers would be like for a single 680 at 1440p @ 100 Hz? I am looking into the new Catleaps from the buy here for my upcoming build, and I was wondering what sort of frames I would get in Skyrim, BF3, and mayhaps Crysis with a single 2GB 680. AA would not be necessary, if you need to know. Thanks, and hope you sort out your condensation issues (with the phase change) so you can get this system really crunching.


I'd honestly go with two 680's to keep the FPS closer to 100 Hz. You'd have to turn down the settings quite a bit to do that on one 680.


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> I'd honestly go with two 680's to keep the FPS closer to 100 Hz. You'd have to turn down the settings quite a bit to do that on one 680.


My wallet suddenly cries out in pain and jumps off of my desk. Also, do you think I would have any bottleneck while gaming with a 3820? I don't think I should, if I OC it to around 4.5, but I was just wondering.


----------



## CallsignVega

Quote:


> Originally Posted by *feteru*
> 
> My wallet suddenly cries out in pain and jumps off of my desk. Also, do you think I would have any bottleneck while gaming with a 3820? I don't think I should, if I OC it to around 4.5, but I was just wondering.


No, 3820 plus 2x 680's will be perfect.


----------



## Retell

You have invented a new form of eye bleach, I am subbed so hard.


----------



## covert ash

Quote:


> Originally Posted by *CallsignVega*
> 
> No, 3820 plus 2x 680's will be perfect.


What are your thoughts on a 3820 for three or four 680's? Or would that be 3930K territory?


----------



## feteru

Also, is there a noticeable difference between the 3820 and the 3930k? Is it just that the 3930k is Hexa-core, or is there an increase in IPC/binning?


----------



## Nocturin

Quote:


> Originally Posted by *covert ash*
> 
> What are your thoughts on a 3820 for three or four 680's? Or would that be 3930K territory?


If you're going to spend $1k+ on GPUs, why would you skimp with anything less than a 3930K?
Quote:


> Originally Posted by *feteru*
> 
> Also, is there a noticeable difference between the 3820 and the 3930k? Is it just that the 3930k is Hexa-core, or is there an increase in IPC/binning?


Yes, there is a noticeable difference between a 4-core, 8-thread CPU and a 6-core, 12-thread CPU. Binning differences come into play with the 3930K and the 3960X.

This isn't the right place to ask questions like these, guys; there's an entire forum devoted to Intel CPU + NVIDIA GPU questions


----------



## vikingsteve

Is there anything Vega can't do? Honest question. It seems like he's tried just about everything when it comes to high-end parts.


----------



## feteru

Quote:


> Originally Posted by *Nocturin*
> 
> If you're going to spend $1k+ on GPUs, why would you skimp with anything less than a 3930K?
> Yes there is a noticeable difference between a 4 core, 8 thread cpu and a 6 core 12 thread cpu. Binning differences come into play with the 3930k and the 3960x
> 
> This isn't the right place to ask questions like these guys, there's an entire forum devoted to Intel CPU + nvidia gpu questions


Sorry, I was just wondering, as it seems Vega would know. Thanks for the answers though!


----------



## Nocturin

Quote:


> Originally Posted by *feteru*
> 
> Sorry, I was just wondering, as it seems Vega would know. Thanks for the answers though!


I'm sure he doesn't mind, but that way your questions get exposure beyond just the people subbed








and you're welcome


----------



## l88bastar

Quote:


> Originally Posted by *Nocturin*
> 
> I'm sure he doesn't mind, but that way you guys get the exposure to your questions rather than just the people subbed
> 
> 
> 
> 
> 
> 
> 
> .
> and you're welcome


On the topic of asking Vega offtopic questions... "Vega, am I going to hell for applying a custom American Flag paint job to my Ferrari in TDU2?"


----------



## Nocturin

I don't know whether to puke or hum the Team America song...

what's TDU2?


----------



## zdude

Quote:


> Originally Posted by *Nocturin*
> 
> I don't know whether to puke or hum the Team America song...
> what's TDU2?


+1


----------



## covert ash

Quote:


> Originally Posted by *Nocturin*
> 
> I don't know whether to puke or hum the Team America song...
> what's TDU2?


Test Drive Unlimited 2.


----------



## dr/owned

Hang on let me check.

Yep, the snowball I left in hell hasn't melted, and yes, it is possible to make a 458 look ugly


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> On the topic of asking Vega offtopic questions.... "Vega am I going to hell for applying a custom American Flag paint job to my Ferrari in TDU2?


LOL! Reminds me of colorful versions of the random geometric shapes they used to paint on the sides of warships in WWII in order to break up the ship's outline on the horizon.


----------



## CallsignVega

Anyone see this crash before:










I'm getting that in all games now. Pretty much have tried everything: set the system overclock to default, removed Precision X and ran the GPUs at default, default NVIDIA control panel options, tried 4-way SLI, 1 GPU, Surround mode, single display mode, tried drivers 301.10, 301.24 and 301.25, PCI-E 2.0 mode, PCI-E 3.0 mode. They all crash. I wonder if those stupid multiple DirectX installs that Steam does every time a new game is installed screwed something up. Gah, I don't feel like re-installing Windows again.


----------



## roadlesstraveled

I'm getting that same error in BF3 (only game I currently play) with my 460's and I've tried several different drivers (with clean installs). One night everything was working fine and then all of a sudden I started getting that error. I haven't been able to figure out what changed in my setup.

I've tried disabling SLI, setting my system OC back to default, disabling just about every program on startup, etc... My GPU's have never been OC'd and the crash happens as soon as the game launches so the cards aren't even getting stressed.


----------



## Nocturin

I've had that error twice and a clean nvidia driver re-install took care of it for me right after they launched the 295 drivers. I haven't had it since.

I checked my BIOS afterwards and my RAM profile had reset; I had to re-enable my XMP profile. I do get random freezes/colored screens while playing BF3 (about once every two weeks), but I chalk that up to the game and not my system.


----------



## CallsignVega

Ya, something weird is going on here. I thought at first this problem was Surround related, or 4-way SLI related. But then I went to single GPU and single monitor config and it's still happening! Bah.


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, something weird is going on here. I thought at first this problem was Surround related, or 4-way SLI related. But then I went to single GPU and single monitor config and it's still happening! Bah.


Awww man. Have you been having any issues with RSODs?


----------



## CallsignVega

Quote:


> Originally Posted by *feteru*
> 
> Awww man. Have you been having any issues with RSODs?


Nope, just the games crashing lol. I guess that is just as bad.


----------



## TA4K

Quote:


> Originally Posted by *CallsignVega*
> 
> Nope, just the games crashing lol. I guess that is just as bad.


Play a racing game. When you crash, it will seem even more realistic.


----------



## nvidiaftw12




----------



## Citra

Quote:


> Originally Posted by *TA4K*
> 
> play a racing game. When you crash, it will seem even more realistic.


Ba dum tss.


----------



## feteru

Quote:


> Originally Posted by *nvidiaftw12*


I just cannot get enough of your avatar, it's so wonderful. I would say it's almost as good as iCrap's old one


----------



## CallsignVega

Been hunting down these random crashes over the last few days. Now one of the GTX 680's appears to have gone bad. Graphical corruption all over that monitor, and when it inserts its frame into SLI in a single-monitor config I can also see the corruption. So I'll be down to tri-SLI while I send that one back to EVGA for repair. Fun stuff.


----------



## zdude

I thought that you bought a fifth 680 for a case like this.


----------



## CallsignVega

Someone offered me a good deal on the fifth so I got rid of it, lol. Bit me in the butt.









This was after the four in the machine were running fine. This is also the first GPU failure I've ever had.


----------



## CallsignVega

I've had this guy on the left build me a pair of his 3D glasses. They should be here on Tuesday. (click photo)


----------



## crust_cheese

What the...

can you elaborate upon what kind of 3D goggles those exactly are?


----------



## CallsignVega

Quote:


> Originally Posted by *crust_cheese*
> 
> What the...
> can you elaborate upon what kind of 3D goggles those exactly are?


Haha, no idea, he just looks awesome.

I did order a pair of these to test out:

http://store.sony.com/webapp/wcs/stores/servlet/CategoryDisplay?catalogId=10551&storeId=10151&langId=-1&identifier=S_Personal_3DViewer#overview/theater


----------



## Nocturin

Holy asterisks, you have bottomless pockets.

Those are awesome!


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> Haha no idea, he just look's awesome.
> I did order a pair of these to test out:
> http://store.sony.com/webapp/wcs/stores/servlet/CategoryDisplay?catalogId=10551&storeId=10151&langId=-1&identifier=S_Personal_3DViewer#overview/theater


o.0 those look awesome. How much are they, though? Also, can we have some pictures of the actual computer and the room it is in, seeing as you just have the monitors hooked up in another room


----------



## TinDaDragon

How come I haven't subscribed to this?


----------



## CallsignVega

Quote:


> Originally Posted by *feteru*
> 
> o.0 those look awesome. How much are they though? Also, can we have some pictures of the actual computer, and the room it is in, seeing as you just have the monitors hooked up into another room


They are $800. Cross-talk, ghosting and glasses flicker are what have always turned me off with 3D. This supposedly eliminates all three of those problems. I thought it would be cool to test out. Plus it will be cool to see 0.7-inch OLED screens.

The thing that could kill it for me though is the 1280x720 resolution in each eye and if there is any input lag. We shall see.


----------



## rawfuls

Pretty sure at this point, your 'computer room' has blown past the military's simulation rooms?


----------



## CrazzyRussian

Keep on delivering Vega!


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> They are $800. Cross-talk, ghosting and glasses flicker is what has always turned me off with 3D. This supposedly eliminates all three of those problems. I thought it would be cool to test out. Plus it will be cool to see .7 inch OLED screens.
> The thing that could kill it for me though is the 1280x720 resolution in each eye and if there is any input lag. We shall see.


Well... the 720p I could deal with considering it's going to look like a 70" screen (depending how large the pixels are; that DPI, man...







). Could be very immersive, or just plain tiring on the eyes. It should eliminate all crosstalk/ghosting because the correct image is shown on the correct eye's screen... but then you might notice other things related to 3D drivers.

Do you know if a "base station" handles all this and it's transmitted wirelessly, or how the rest of the tech works? Essentially, it would seem that it's two separate "monitors", and I don't know how 3D games/drivers would fare sending the image to two "monitors" instead of one 120 Hz one.

How are they connected, anyways? I didn't see anything about that on the website you linked to.


----------



## Bastyn99

When hardware manufacturers say grace at dinner, they thank the good lord for CallsignVega.


----------



## feteru

Quote:


> Originally Posted by *Nocturin*
> 
> Well... the 720p I could deal with considering it's going to look like a '70 screen(depending how large the pixels are, that DPI man...
> 
> 
> 
> 
> 
> 
> 
> ). Could be very immersive, or just plain tiring on the eyes. It should emilnate all crosstalk/ghosting because the correct image is shown on the right screen... but then you might notice other things related to 3D drivers.
> Do you know if a "base station" handles all this and it's transmitted wirelessly or how the rest of the tech works? Essentially, it would seem that it's two seperate "monitors" and I don't know how 3D games/drivers would fare sending the image to two "monitors" instead of one 120hz one.
> How are they connected anyways? I didn't see anything about that on the website you linked too.


I think I saw on the website that they have a base station, which is linked up by HDMI to your computer/whatever. Not so sure though


----------



## CallsignVega

Quote:


> Originally Posted by *Bastyn99*
> 
> When hardware manufacturers say grace at dinner, they thank the good lord for CallsignVega.


lol
Quote:


> Originally Posted by *feteru*
> 
> I think I saw on the website that they have a base station, which is linked up by HDMI to your computer/whatever. Not so sure though


Yeah, there is a controller box. Instead of sending two separate images to a single display and having shutter glasses block the image destined for the other eye, this headset sends each image to its own display in each eye. That eliminates virtually all the problems with 3D. If they made a 1080p version of this with DisplayPort (required, as HDMI is too slow) and a game-mode pass-through to minimize input lag, I'd easily pay $1500 for it. If they made a 1440p version with all of the above, I'd easily pay $3000-4000 for it.

Time to step it up Sony!


----------



## lagittaja

This is so EPIC! Just went through the whole thread in one sitting. Subbed!
Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *feteru*
> 
> Awww man. Have you been having any issues with RSODs?
> 
> 
> 
> Nope, just the games crashing lol. I guess that is just as bad.

Quote:


> Originally Posted by *TA4K*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> Nope, just the games crashing lol. I guess that is just as bad.
> 
> 
> 
> play a racing game. When you crash, it will seem even more realistic.

Quote:


> Originally Posted by *Citra*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TA4K*
> 
> play a racing game. When you crash, it will seem even more realistic.
> 
> 
> 
> Ba dum tss.

Today's best laughs, thank you very much


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> lol
> Ya, there is a controller box. Instead of sending two separate images to a single display and having shutter glasses shutter out the image destined for the other eye, this headset sends each applicable image to it's own display in each eye. Eliminates virtually all problems with 3D. If they made a 1080P version of this with Display port (required as HDMI is too slow) and a game-mode pass through to minimize input lag I'd easily pay $1500 for it. If they made a a 1440P version of this with all the above, I'd easily pay $3000-4000 for it.
> Time to step it up Sony!


Man, a 1440p display at less than 2 inches would be insane! If they could get the input lag low, that would be really epic (maybe they could have each eye set up as a separate display, and then some custom drivers by Sony? That could work, but might be buggy)
Also
Post Updates please!


----------



## TA4K

Quote:


> Originally Posted by *lagittaja*
> 
> This is so EPIC! Just went through the whole thread in one sitting. Subbed!
> Today's best laughs, thank you very much


Glad to be of assistance kind sir.


----------



## CallsignVega

Testing is done on the Sony HMZ-T1.

These thoughts are for computer gaming 3D only. I tested desktop, F1 2010, Metro 2033, Crysis 2, Starcraft 2, Witcher 2 all in 3D.

As I imagined, 720p is a super low resolution. I could see all of the pixels, and in games there is tons of aliasing on objects, even with 8x AA enabled. Distant objects are super blurry as there is not enough resolution to resolve them. It was especially noticeable in F1 2010: when racing and looking down the track, it was hard to make out anything because the resolution was so low.

The image size, sitting at my desk, looks like what a 24" monitor in front of me would be. Definitely a lot smaller than I was expecting. I guess you couldn't make it any larger, as 720p is already being pushed to the max. The FW900 Fresnel Surround setup has like a million times more immersion.

There is input lag. A good three frames, enough to annoy the hell out of me, especially coming from something that has absolutely zero.

There is motion blur. I don't know if it is because of the low refresh rate (60 Hz) or not. It is better than LCD, but it is still noticeable. It doesn't have the perfect crystal-clear image of a CRT in fast movements like scrolling around the Starcraft 2 map. Doing that reveals even the slightest motion blur.

No matter where I set the eye-span adjustments or how I positioned the visor, the edges of the image were blurred. Even a slight movement of the head can affect that. This is one of my largest problems with the visor: you cannot get a clear image over the full surface no matter what you do.

The 3D effect was pretty decent, but not as "deep" as I've seen with something like Avatar in the theater with polarized glasses. It was still pretty good though.

It looks like the visor uses Fresnel lenses, so there are chromatic aberrations. If you view, say, a white mouse cursor on a black background, the cursor isn't all white. The lens splits up some of the colors and you can see red and blue hues around objects, just like the Fresnel in front of our FW900.

Comfort is pretty poor IMO. Tons of weight right on the bridge of your nose. The little rubber shade flaps are secured very flimsily under your eyes and just fall off if you aren't careful. Even reclining in a chair, I wouldn't want to keep these on my head for more than an hour or two. Custom comfort mods are pretty much a must.

Build quality is pretty flimsy. If these fell off a 3-4 foot desk I would say they would break a decent percentage of the time.

On bright screens, I noticed a slight buzz sound in the left earpiece. Some sort of cross-talk interference.

The sound quality is about what you would expect from a $30 pair of headphones from Best Buy. Let's just say going from one of the best headphones in the world (HD650's) to these is quite a change.

They are definitely not as good as I expected and I would firmly put them in the "toy" category. To me it kinda feels like something Nintendo would make. I am not sure if I would even bother with a 1080P version. Maybe if they made a 1440P version and got rid of some of the input lag issues and the focus/aberration problem I'd give it a shot again.

So after four hours it's already packed up and ready to go back to Amazon!

Hopefully you like it better and can test it out in some movies.

Just for computer gaming, my FW900 Fresnel Surround setup is like a billion times better. The Fresnel lenses and large screen are awesome for immersion (the Fresnel adds just a hint of depth feel), plus you get zero ghosting, zero blur, zero lag and literally 10 times the resolution. Kinda bummed about the Sony visor, but it is what it is! The tech has promise but it falls too short for me at this time.
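For the curious, the "10 times the resolution" claim roughly checks out. A small sketch, assuming 2304x1440 per FW900 (its maximum mode) and the HMZ-T1's 1280x720 per-eye panel:

```python
# Sanity check on "literally 10 times the resolution": three FW900s
# versus the HMZ-T1's single 720p panel per eye. The resolutions used
# here are assumptions: 2304x1440 is the FW900's maximum mode,
# 1280x720 is the HMZ-T1 microdisplay.

def pixels(w: int, h: int) -> int:
    return w * h

surround = 3 * pixels(2304, 1440)   # three FW900s in Surround
hmz      = pixels(1280, 720)        # one eye's OLED panel

print(surround / hmz)               # ~10.8x the pixel count
```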


----------



## Mhill2029

$800............









These companies are making a killing on gimmicky 3D tech. Personally I can't see what all the fuss is about, and never have with 3D. It's all "meh, seen it all before" for me. Sure, the tech has advanced dramatically since the days of watching Jaws in 3D with those pathetic specs of the '80s, but I just don't see the fascination with it that so many do. Maybe I'm fussy...

When glasses-free TVs etc. come to a head, then maybe I'll start becoming interested in it again.


----------



## CallsignVega

Quote:


> Originally Posted by *Mhill2029*
> 
> $800............
> 
> 
> 
> 
> 
> 
> 
> 
> these companies are making a killing on 3D gimmicky tech. Personally i can't see what all the fuss is about, and never have with 3D. It's all meh seen it all before for me. Sure tech has advanced dramatically since the days of watching Jaws in 3D with those pathetic specs of the 80's, but i just don't see the fascination with it that so many do. Maybe i'm fussy...
> When glasses free TV's etc come to head, then maybe i'll start becoming interested in it again.


I agree, 3D is overrated. It's not worth the trade-offs. 2D at 120 Hz is much better than 3D at 60 Hz.


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Testing is done on the Sony HMZ-T1.
> These thoughts are for computer gaming 3D only. I tested desktop, F1 2010, Metro 2033, Crysis 2, Starcraft 2, Witcher 2 all in 3D.
> As I imagined, 720P is a super low resolution. I could see all of the pixels and in games there is tons of aliasing on objects, even with 8X AA enabled. Distant objects are super blurry as there is not enough resolution to resolve that. Especially noticeable in F1 2010 when racing and looking down the track, it was hard to make out anything because the resolution was so low.
> The image size to me sitting at my desk here if I had a monitor in front of me would be about a 24". Definitely a lot smaller than I was thinking it was going to be. I guess you couldn't make it any larger as 720P is already being pushed to the max. The FW900 Fresnel Surround setup has like a million times more immersion.
> There is input lag. A good three frames. Enough to annoy the hell out of me. Especially coming from something that has absolutely zero.
> There is motion blur. I don't know if it is because of the low refresh rate (60 Hz) or not. It is better than LCD but it is still noticeable. It doesn't have the perfect crystal clear image of CRT in fast movements like scrolling around the Starcraft 2 map. Doing that reveals even the slightest motion bur.
> No matter where I set the eye-span adjustments nor positioned the visor, the edges of the image were blurred. Even a slight movement of the head can effect that. This is one of my largest problems with the visor as you cannot get a clear image over the full surface no matter what you do.
> The 3D effect was pretty decent, but not as "deep" as I've seen with something like say polarized glasses avatar in the theater. It was still pretty good though.
> It look's like the the visor uses Frensel lenses. So there are chromatic abberations. If you view say a white mouse cursor on a black background, the cursor isn't all white .The lens splits up some of the colors and you can see red and blue hue's around objects just like the Fresnel in front of our FW900.
> Comfort is pretty poor IMO. Ton's of weight right on the bridge of your nose. The little shade rubber flaps are secured very flimsy under your eyes and just fall off if you aren't careful. EVen reclining in a chair I wouldn't want to keep these on my head for more than a hour or two. Custom comfort mods are pretty much a must.
> Build quality is pretty flimsy. If these fell off a 3-4 foot desk I would say they would break a decent percentage of the time.
> On bright screens, I noticed a slight buzz sound in the left earpiece. Some sort of cross-talk interference.
> The sound quality is about what you would expect from a $30 pair of headphones from Best Buy. Let's just say going from one of the best headphones in the world (HD650's) to these is quite a change.
> They are definitely not as good as I expected and would firmly put them in the "toy" category.. To me it kinda feels like something Nintendo would make. I am not sure if I would even bother with a 1080P version. Maybe if they made a 1440P version and got rid of some of the input lag issues and the focus/abberation problem I'd give it a shot again.
> So after four hours it's already packed up and ready to go back to Amazon!
> Hopefully you like it better and can test it out in some movies.
> Just for computer gaming, my FW900 Fresnel Surround setup is like a billion times better. The Fresnel lenses and large screen are awesome for immersion (Fresnel add's just a hint of depth feel), plus you get zero ghosting, zero blur, zero lag and literally 10 times the resolution. Kinda bummed about the Sony visor but it is what it is! The tech has promise but it falls too short for me at this time.


Thanks for the review. I was really hoping that we finally had the tech for something like these "glasses" to shine. I'm disappointed, but very glad you can return them. Only a 24" "screen"? That's really nothing considering how close they are to your face. I was expecting something much, much larger in feel.

Can you elaborate on the 3D? Did it pop out at you or just add the sense of depth?
Quote:


> Originally Posted by *Mhill2029*
> 
> $800............
> 
> 
> 
> 
> 
> 
> 
> 
> these companies are making a killing on 3D gimmicky tech. Personally i can't see what all the fuss is about, and never have with 3D. It's all meh seen it all before for me. Sure tech has advanced dramatically since the days of watching Jaws in 3D with those pathetic specs of the 80's, but i just don't see the fascination with it that so many do. Maybe i'm fussy...
> When glasses free TV's etc come to head, then maybe i'll start becoming interested in it again.


Even glasses-free 3D could not get me interested. The technology is a gimmick, and they need to focus on increasing resolutions and refresh rates before pushing "picture perfect 3D". Give me more resolution and holograms before 3D!
Quote:


> Originally Posted by *CallsignVega*
> 
> I agree, 3D is overrated. It's not worth the trade off's. 2D at 120 Hz is much better than 3D at 60 Hz .


I concur most copiously. I'm excited for the day when I can upgrade to a 120 Hz monitor.


----------



## CallsignVega

Quote:


> Originally Posted by *Nocturin*
> 
> Thanks for the review. I was really hoping that we finally had the tech for something like these "glasses" to shine. I'm disappointing though, but very glad you can return them. Only a 24" "screen"? That's really nothing considering how close they are to your face. I was expecting something much much larger in feel.
> Can you elaborate on the 3D? Did it pop out at you or just add the sense of depth?
> Even glasses-free 3D could not get me interested. The technology is a gimmick, and they need to focus on increasing resolutions and refresh rates before pushing "picture perfect 3D". Give me more resolution and holograms before 3D!
> I concur most copiously. I'm excited for the day when I can upgrade to a 120hz monitor
> 
> 
> 
> 
> 
> 
> 
> .


Ya, I was expecting a larger view/immersion factor than what I got. It looks like a 24" LCD on your desk, but in the visor it is completely black around the image, like a theater with the image floating in the middle.

The 3D was pretty decent but it wasn't like "OMG this guy is crawling out of the screen" like I've seen in some movies like Avatar 3D in theater with polarized glasses. The tech has promise but needs a lot of work.


----------



## l88bastar

I like bacon.


----------



## CallsignVega

http://www.youtube.com/watch?v=jsX7ogE8Nhg

Make sure to view in 720P.

Guild Wars 2 is as good as I thought it would be, and one of the main reasons I created this unique NVIDIA Surround setup.

Sorry about the black borders between the screens; I do not have a wide-angle camera lens to accurately capture the seamless image of the lenses at this time. To see that effect in action, please view the Fresnel Lens video in my channel. In a normal playing/seating position, the three images come together to create one "world" image and the Fresnel lenses add a slight depth effect. It is like looking through a large window out into the world and makes for one amazing gaming experience.

As you can see in the video, SLI is not working properly (the developers are working on a fix). It is only using around one GTX 680's worth of processing power, but it will read around 3x 33% as it has to send the frames to three different cards in this Surround setup.
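That "3x 33%" reading follows from how per-card utilization adds up across an SLI group: monitoring tools report each GPU separately, so the single-GPU-equivalent load is the sum. A trivial sketch with illustrative numbers:

```python
# Why three cards can each read ~33% while only "one GPU's worth" of
# work is being done: per-card utilization sums to the equivalent
# single-GPU load. The 0.33 figures are illustrative, matching the
# reading described above.

per_card = [0.33, 0.33, 0.33]          # reported utilization per GPU
single_gpu_equivalent = sum(per_card)  # ~1.0, i.e. about one GTX 680 fully busy
print(round(single_gpu_equivalent, 2))
```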

My 4th EVGA GTX 680 malfunctioned and I have a replacement on the way. Amazingly, as you can see in the video it runs pretty darn smooth.

I do a quick tour of Divinity's Reach, the most impressive city I've ever seen in any MMORPG. Then I head out to the World vs World vs World area called the "Mists". Here you get bumped up to level 80 but you can see my Ranger still has noob clothes and skills. You still need to level up to properly play WvWvW. I chase some player down and they go hide in the fort as we try and smash the door down. Those doors are pretty strong and take a good while to destroy so I switch over to a PvP "mini-game".

Now in the PvP mini-games, you get leveled to 80 but you also get 80 PvP armor and skills. Here you can see some of the cool effects. I was playing off to the side craning my neck so the camera had a good view. Don't laugh at my playing skill because of this!









The PvP mini-game plays very well and I think I even managed a kill in there somewhere. I am glad I pre-purchased the collectors edition. GW2 is shaping up to be quite the game.


----------



## jincuteguy

Quote:


> Originally Posted by *De-Zant*
> 
> It's the samsung s23a750d with bezels removed. No native portrait mode support though, so be prepared to have to mod it a bit if you want that setup
> 
> 
> 
> 
> 
> 
> 
> 
> Vega, you planning on fresnel lenses for the FW900s? Or?


So how do you mod those S23A750Ds to work in portrait mode, since you said they don't have native portrait support?


----------



## De-Zant

Quote:


> Originally Posted by *jincuteguy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *De-Zant*
> 
> It's the samsung s23a750d with bezels removed. No native portrait mode support though, so be prepared to have to mod it a bit if you want that setup
> 
> 
> 
> 
> 
> 
> 
> 
> Vega, you planning on fresnel lenses for the FW900s? Or?
> 
> 
> 
> So how do you mod those S23A750D to work in portrait mode? since you said it doesn't have native portrait mode.

Instructions found here in Vega's previous display setup thread.


----------



## jincuteguy

Quote:


> Originally Posted by *De-Zant*
> 
> Instructions found here in Vega's previous display setup thread.


That only showed how to remove the bezels and set up the monitors; it didn't say anything about how to make it work in portrait mode.


----------



## armartins

They will work in portrait; they just don't have a base capable of pivoting.


----------



## feteru

Vega, what do you think about the 690? Also, when is your 4th 680 getting back, so you can get your system fully operational again?


----------



## TA4K

Just quietly, the 690 looks like it only has 2 GB per GPU, so when someone tries to hook a few 2560x1600 displays up to a pair of them it will just do what the MARS IIs did in BF3: LAG!
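For a rough sense of why 2 GB gets tight at triple 2560x1600, here is a back-of-the-envelope render-target estimate. The bytes-per-pixel, MSAA level, and buffer counts are all assumptions for illustration; textures and geometry, the real bulk of VRAM use, come on top of this:

```python
# Back-of-the-envelope render-target memory for triple 2560x1600
# Surround. Assumptions: 4 bytes per pixel (RGBA8 / D24S8), 4x MSAA,
# two MSAA color buffers plus one MSAA depth buffer. Textures are NOT
# counted, and they are usually the largest consumer.

W, H, SCREENS = 2560, 1600, 3
BPP = 4        # bytes per pixel
MSAA = 4       # samples per pixel

pixels = W * H * SCREENS
color  = pixels * BPP * MSAA * 2   # two MSAA color buffers
depth  = pixels * BPP * MSAA       # one MSAA depth buffer
mb = (color + depth) / 2**20
print(round(mb, 1))                # hundreds of MB before any textures
```

Several hundred megabytes of render targets alone leaves little of a 2 GB card for high-resolution textures, hence the stuttering.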


----------



## kakee

What is the use of the keyboard stand? It's just perfect for my next setup


----------



## CallsignVega

I noticed my eyes starting to get pretty tired viewing such a large image through those Fresnel lenses so I am taking my build to a much scaled back and different approach. You really can have too large of a display setup for gaming! I also find myself playing more competitively in games like BF3 and Counter-strike on a single display system (CRT).

Single FW900 @ 2048x1280 / 90 Hz, no Fresnel, single water-cooled GTX 680 overclocked to the max, Asus Maximus V Gene and 3770K, 2x Vertex 4's for RAID 0 inbound. Everything will be water cooled; hopefully they come out with a motherboard water block. Phase change won't be a part of the loop anymore, just the current ambient side, and the Geothermal side is still a go in the future.

It's amazing how well a single overclocked GTX 680 runs games on a FW900 @ 2048x1280 / 90 Hz. The smoothness and lack of any sort of lag, display response or input is incredible. I can tell the difference between the frame lag that 4-way SLI introduces and single GPU. I am back to the single GPU / single display fan club until those high-resolution / high refresh rate OLED screens come out.
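For reference, that custom mode sits near the top of what an analog CRT link can drive. A rough pixel-clock estimate, assuming a typical ~25% total blanking overhead (the exact GTF/CVT timing would differ slightly):

```python
# Rough pixel-clock estimate for the custom 2048x1280 @ 90 Hz FW900
# mode. CRTs need real horizontal/vertical blanking intervals; ~25%
# total overhead is an assumed ballpark, not an exact timing.

W, H, HZ = 2048, 1280, 90
BLANKING_OVERHEAD = 1.25          # assumed ~25% extra for blanking

pixel_clock_mhz = W * H * HZ * BLANKING_OVERHEAD / 1e6
print(round(pixel_clock_mhz))     # on the order of 300 MHz
```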

I will make a post in the market section and should have some good stuff for sale. Everything from A+ FW900's, to 680's with EK blocks on them (and I have one sealed box EVGA GTX 680), Vertex 3's, a really nice 3960X, RIVE, etc etc.

I know, quite a 180-degree turn... I may have ADD.


----------



## Bodycount

Whoa! Vega


----------



## Smo

Quote:


> Originally Posted by *CallsignVega*
> 
> I noticed my eyes starting to get pretty tired viewing such a large image through those Fresnel lenses so I am taking my build to a much scaled back and different approach. You really can have too large of a display setup for gaming! I also find myself playing more competitively in games like BF3 and Counter-strike on a single display system (CRT).
> Single FW900 @ 2048x1280 / 90 Hz, no Fresnel, single water-cooled GTX 680 overclocked to max, Asus Maximus V Gene and 3770K, 2x Vertex 4's for Raid 0 inbound. Everything will be water cooled, hopefully they come out with a MB water block. Phase change won't be a part of the loop anymore, just the current ambient side and the Geothermal side is still a go in the future.
> It's amazing how well a single overclocked GTX 680 runs games on a FW900 @ 2048x1280 / 90 Hz. The smoothness and lack of any sort of lag, display response or input is incredible. I can tell the difference between the frame lag that 4-way SLI introduces and single GPU. I am back to the single GPU / single display fan club until those high-resolution / high refresh rate OLED screens come out.
> I will make a post in the market section and should have some good stuff for sale. Everything from A+ FW900's, to 680's with EK blocks on them (and I have one sealed box EVGA GTX 680), Vertex 3's, a really nice 3960X, RIVE, etc etc.
> I know, quite a 180 deg turn.. I may have ADD.


You're as big a pain in the ass as me when it comes to changing things


----------



## DarthBaiter

Quote:


> Originally Posted by *CallsignVega*
> 
> I've had this guy on the left build me a pair of his 3D glasses. They should be here on Tuesday. (click photo)


LoL... The Jerk 3D glasses... you just can't sue Steve Martin when you go cross-eyed.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> I noticed my eyes starting to get pretty tired viewing such a large image through those Fresnel lenses so I am taking my build to a much scaled back and different approach. You really can have too large of a display setup for gaming! I also find myself playing more competitively in games like BF3 and Counter-strike on a single display system (CRT).
> Single FW900 @ 2048x1280 / 90 Hz, no Fresnel, single water-cooled GTX 680 overclocked to max, Asus Maximus V Gene and 3770K, 2x Vertex 4's for Raid 0 inbound. Everything will be water cooled, hopefully they come out with a MB water block. Phase change won't be a part of the loop anymore, just the current ambient side and the Geothermal side is still a go in the future.
> It's amazing how well a single overclocked GTX 680 runs games on a FW900 @ 2048x1280 / 90 Hz. The smoothness and lack of any sort of lag, display response or input is incredible. I can tell the difference between the frame lag that 4-way SLI introduces and single GPU. I am back to the single GPU / single display fan club until those high-resolution / high refresh rate OLED screens come out.
> I will make a post in the market section and should have some good stuff for sale. Everything from A+ FW900's, to 680's with EK blocks on them (and I have one sealed box EVGA GTX 680), Vertex 3's, a really nice 3960X, RIVE, etc etc.
> I know, quite a 180 deg turn.. I may have ADD.


Hmm, I live in Alabama and am wanting an FW900.







Although I really need a 2600K and probably don't have the cash for both.


----------



## IXcrispyXI

Is it just me, or is that a dog's face? Or an illusion?


----------



## nvidiaftw12

It's a dog in the chair...


----------



## Canis-X

Must be his dog chillin in a chair?


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> I noticed my eyes starting to get pretty tired viewing such a large image through those Fresnel lenses so I am taking my build to a much scaled back and different approach. You really can have too large of a display setup for gaming! I also find myself playing more competitively in games like BF3 and Counter-strike on a single display system (CRT).
> Single FW900 @ 2048x1280 / 90 Hz, no Fresnel, single water-cooled GTX 680 overclocked to max, Asus Maximus V Gene and 3770K, 2x Vertex 4's for Raid 0 inbound. Everything will be water cooled, hopefully they come out with a MB water block. Phase change won't be a part of the loop anymore, just the current ambient side and the Geothermal side is still a go in the future.
> It's amazing how well a single overclocked GTX 680 runs games on a FW900 @ 2048x1280 / 90 Hz. The smoothness and lack of any sort of lag, display response or input is incredible. I can tell the difference between the frame lag that 4-way SLI introduces and single GPU. I am back to the single GPU / single display fan club until those high-resolution / high refresh rate OLED screens come out.
> I will make a post in the market section and should have some good stuff for sale. Everything from A+ FW900's, to 680's with EK blocks on them (and I have one sealed box EVGA GTX 680), Vertex 3's, a really nice 3960X, RIVE, etc etc.
> I know, quite a 180 deg turn.. I may have ADD.


Haha, you crack me up, Vega. I'd love to try an FW900, but I don't have the room for it lol.

BTW, do the Fresnel lenses still work well with LCDs? I'm debating the 90 Hz 1440p monitors in portrait, but only if I can go bezel-less.


----------



## PoopaScoopa

Because of eye strain?







Visine doesn't help? You're gonna miss the immersion for GW2. I could def see single monitor being better for competitive BF3 though.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Hmm, I live in Alabama and am wanting a fw900.
> 
> 
> 
> 
> 
> 
> 
> Altough I really need a 2600k and probably don't have the cash for both.


I actually have one of the pricey A+ ones for sale (keeping one as a spare), but I also have an Accurate IT one that is B+ (the screen surface is perfect) that I could let go for a decent price.
Quote:


> Originally Posted by *stren*
> 
> Haha you crack me up vega. I'd love to try an FW900, but I don't have the room for it lol.
> BTW do the fresnel lens still work well with LCDs? I'm debating the 90hz 1440p monitors in portrait but only if I can go bezel-less


You know, I never tried one on an LCD. I figure a 27" 1440P is large enough as it is. I think the perfect screen size for gaming is in the 27-32" range.
Quote:


> Originally Posted by *PoopaScoopa*
> 
> Because of eye strain?
> 
> 
> 
> 
> 
> 
> 
> Visine doesn't help? You're gonna miss the immersion for GW2. I could def see single monitor being better for competitive BF3 though.


I will give you that, the immersion factor in GW2 on the setup was just crazy awesome. Far beyond any of my LCD Eyefinity/Surround setups. It was also incredibly smooth. The magnified image was so large, though, and in order to get the aspect ratio with the lenses just right I had to sit pretty close. It really is like being in the world, but I found myself having to move my eyes and head way too much, and it was actually a hindrance in PvP. With a single top screen like the FW900 you can get virtually all of the action in central vision, and you can perceive and react to things faster. That's why I think you can have too large an image for gaming unless you are going for pure immersion.

Really there are two ways you can go: immersion and competitiveness. It is very hard to do both. I am quite a competitive person so I am going with the new simpler build for that.


----------



## armartins

Vega, don't you think you're getting tired because of the CRT nature itself?

After the 5970 with mini-DP and DVI cables, I came back to single-GPU smoothness, and I can tell the difference the 3x DP on my 6970 DCII brought me in WoW (it was already a pain to run Eyefinity in DX11 due to the small framebuffer and the lack of multi-GPU support in windowed mode, which the Eyefinity resolution required). The real difference, though, was in BFBC2 back before BF3: the OC'd 6970 had just enough power to run it at 50-60 FPS even with 8x MSAA, which wasn't achievable on my WC'd 5970 because of the framebuffer limitation. At the same 4x AA with maximum settings, the 6970 blew away my previous WC'd 5970 in smoothness (not raw power, obviously), and a locked, 100% smooth 60 FPS at 4x AA was totally achievable. The problem is that I just can't go back to a single screen.

I was very excited by your comments about SLI'd 680 smoothness, and I was already planning on getting a pair of GK110s when they arrive so I could use 3x DVI connections, hopefully with a bigger framebuffer, instead of getting a single 7970 DCII/Lightning and OC'ing it to its maximum while running my setup from 3x DP connections.

Honestly, I was never too fond of your CRT setup, though the Fresnel idea to minimize the gaps was really clever. In my opinion, your best monitor setup is still the 3x 1080p Samsung build; those bezels were insanely thin. Right now I'm challenging you to build a 3x Catleap 2B 1440P setup OC'd to 120 Hz =). If the inner bezels are thin enough (I must know that!), this would be the most epic possible 3x LCD high-resolution, high-refresh-rate IPS setup, while keeping the cost ridiculously low compared with the 3x Samsung 1080p non-IPS panels. And my guess is that it would eliminate your eye strain from CRT + Fresnel.

*Oh crap! "Really there are two ways you can go: immersion and competitiveness. It is very hard to do both. I am quite a competitive person so I am going with the new simpler build for that." answered while I was typing.*


----------



## CallsignVega

Quote:


> Originally Posted by *armartins*
> 
> Vega don't you think you're getting tired because of the CRT nature itself? After the 5970 with mini-dp and DVI cables I came back to single GPU smoothness and I can tell the difference the 3 DP on my 6970DCII had bring me both in WoW (was already a PITA to run eyefinity in DX11 due to small framebuffer and the lack of multi-GPU support in windowed mode, required for eyefinity resolution) but the real difference was on BFBC2 days prior to BF3... the 6970 OC`d has just the needed power to run it at 50-60FPS even with 8xMSAA and 8xMSAA wasn't achievable again due to framebuffer limitation on my WC'd 5970 at the same 4xAA all maximum settings the 6970 blew away my previous WC'd 5970 in smoothness not power obviously and locked 60FPS with 4xAA 100% smooth was totally achieved by the 6970 . The problem is that I just can't go back to single screen... I was very excited with your comments about SLI'd 680s smoothness, and I was already planning on getting a pair of GK110s when they arrived so I could use 3 x DVI connections hopefully with a bigger framebuffer instead of getting a single 7970 DCII/Lightning and OC it to its maximum while running my setup from 3x DP connections. Honestly I was never too found with your CRT setup, the fresnel idea to minimize the gaps was really clever. But in my opinion your best monitor setup still is the 3x 1080p sansung build that bezels were insanely thin. Right now I'm challenging you to build a 3x Catleap 1440P IPS 2B OC'd to 120Hz =). If the inner bezels are thin enough (I must know that!) this would be the most epic possible 3x LCD high resolution, high refresh rate, IPS setup while keeping the costs ridiculous low compared with 3x sansung 1080P non IPS pannels. And my guess is that it would eliminate your eye sore from CRT+Fresnel.


I would say the tired eyes wasn't because of the CRT, it was the combination of sitting in front of a massive magnified image. Sitting about 2 feet from a 52" screen is great for immersion though lol.

You know, I already contemplated the 3x IPS 120 Hz 2B versions, but I did some research and the pixel response time is way too slow for me (the higher Hz cannot change that). If the 2 ms 120 Hz TN panels were too slow, those would be extremely slow for me. Sorry, no more stops on the LCD train for me. It is CRT and then OLED. Like I said, Surround/Eyefinity is great for casual gameplay, but for more competitive gaming I think it slows you down.


----------



## stren

Quote:


> Originally Posted by *armartins*
> 
> Vega don't you think you're getting tired because of the CRT nature itself? After the 5970 with mini-dp and DVI cables I came back to single GPU smoothness and I can tell the difference the 3 DP on my 6970DCII had bring me both in WoW (was already a PITA to run eyefinity in DX11 due to small framebuffer and the lack of multi-GPU support in windowed mode, required for eyefinity resolution) but the real difference was on BFBC2 days prior to BF3... the 6970 OC`d has just the needed power to run it at 50-60FPS even with 8xMSAA and 8xMSAA wasn't achievable again due to framebuffer limitation on my WC'd 5970 at the same 4xAA all maximum settings the 6970 blew away my previous WC'd 5970 in smoothness not power obviously and locked 60FPS with 4xAA 100% smooth was totally achieved by the 6970 . The problem is that I just can't go back to single screen... I was very excited with your comments about SLI'd 680s smoothness, and I was already planning on getting a pair of GK110s when they arrived so I could use 3 x DVI connections hopefully with a bigger framebuffer instead of getting a single 7970 DCII/Lightning and OC it to its maximum while running my setup from 3x DP connections. Honestly I was never too found with your CRT setup, the fresnel idea to minimize the gaps was really clever. But in my opinion your best monitor setup still is the 3x 1080p sansung build that bezels were insanely thin. Right now I'm challenging you to build a 3x Catleap 1440P IPS 2B OC'd to 120Hz =). If the inner bezels are thin enough (I must know that!) this would be the most epic possible 3x LCD high resolution, high refresh rate, IPS setup while keeping the costs ridiculous low compared with 3x sansung 1080P non IPS pannels. And my guess is that it would eliminate your eye sore from CRT+Fresnel.
> *Oh crap! "Really there are two ways you can go: immersion and competitiveness. It is very hard to do both. I am quite a competitive person so I am going with the new simpler build for that." answered while I was typing.*


Well, I guess if you do the screens in landscape mode, you can disable SLI and run one screen + one card for competitiveness, and run three monitors + 2-3 more GPUs for immersion, depending on the game.

I'm competitive, but I'm not good at fps, so I gave up on trying to minimize lag and just like to make things pretty now, so at least it looks good while I wait to respawn lol.


----------



## Lord Xeb

O-o What did I step into? *subbed*


----------



## CallsignVega

Quote:


> Originally Posted by *stren*
> 
> Well I guess if you do the screens in landscape mode, you can disable SLI, and run one screen + one card for competiveness and run three monitors + 2/3 more gpus for immersion depending on the game.
> I'm competitive, but I'm not good at fps, so I gave up on trying to minimize lag and just like to make things pretty now, so at least it looks good while I wait to respawn lol.


I thought about that, but I've never been a big fan of landscape, as you can see by all my builds lol. I think my eyes are a little more sensitive than most, as I fly with night vision goggles for hours every night, so my eyes are already stressed when I get home.


----------



## marbleduck

Wat? Sony FW900s for sale, you say?

I'd like an FW900.


----------



## Nocturin

Wow. Big turn around.

I'm glad you've joined the single monitor/single GPU club.

If I could afford it I'd have 1 awesome monitor and 1 2nd tier GPU.

And wow, you go through systems quickly, just long enough to get used to them.


----------



## Hydrored

Hi Vega, I have been following your thread for a long time, and I hate to say I'm not surprised by the big change. What is your input on SLI with a single monitor as far as microstutter and smoothness?

Thanks for your input and your service.


----------



## CallsignVega

Quote:


> Originally Posted by *Hydrored*
> 
> Hi Vega I have been following your thread for a long time and I hate to say I'm not surprised with the big change. What is your input on SLI with a single monitor as far as micro stutter and smoothness?
> Thanks for your input and your service.


Actually, the 680's do quite well in SLI in that regard, but an overclocked 680 is fast enough to run all the titles I play really well on a single FW900. My 680's were barely even being used with games like Guild Wars 2, yet I got decent FPS in Surround. I just prefer single GPU as everything is as smooth as technically possible with no delay (and zero headaches). You would not believe the hours of work it took to get three FW900's working right at a custom resolution/refresh rate. I basically had to bug out the NVIDIA drivers, go around them to Win 7, and massage its settings. My NVIDIA control panel wouldn't even start anymore, but dangit, I had my settings!

I've also noticed that with my FW900 back in landscape it seems clearer. Having them in portrait threw off their geometry just a bit and made them not as clear. Looking forward to getting my parts for my "mini-me" system tomorrow. I also did a test in single display mode with Witcher 2, which with DOF is about the most demanding PC game I have seen. With one 680 and everything maxed but no DOF I get around 90 FPS, and with DOF I get around 30! So just to get back up to 90 FPS it would take 3-4 680's in SLI, which is just silly considering the minimal visual improvement. There are tons of settings like DOF that take huge performance to run for minimal gains.


----------



## Hydrored

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hydrored*
> 
> Hi Vega I have been following your thread for a long time and I hate to say I'm not surprised with the big change. What is your input on SLI with a single monitor as far as micro stutter and smoothness?
> Thanks for your input and your service.
> 
> 
> 
> Actually, the 680's do quite well in SLI in that regard, but an overclocked 680 is fast enough to run all the titles I play really well on a single FW900. My 680's were barely even being used with games like Guild Wars 2, yet I got decent FPS in Surround. I just prefer single GPU as everything is as smooth as technically possible with no delay (and zero headaches). You would not believe the hours of work it took to get three FW900's working right at a custom resolution/refresh rate. I basically had to bug out the NVIDIA drivers, go around them to Win 7, and massage its settings. My NVIDIA control panel wouldn't even start anymore, but dangit, I had my settings!
> 
> I've also noticed that with my FW900 back in landscape it seems clearer. Having them in portrait threw off their geometry just a bit and made them not as clear. Looking forward to getting my parts for my "mini-me" system tomorrow. I also did a test in single display mode with Witcher 2, which with DOF is about the most demanding PC game I have seen. With one 680 and everything maxed but no DOF I get around 90 FPS, and with DOF I get around 30! So just to get back up to 90 FPS it would take 3-4 680's in SLI, which is just silly considering the minimal visual improvement. There are tons of settings like DOF that take huge performance to run for minimal gains.
Click to expand...

Thanks. I'm in the process of my new build now and I have everything but the GPUs. I currently have 3 24" screens. I want to get rid of them and get a single large monitor, but I figured I would at least need SLI/CF to give me the horsepower for 120Hz and ultra settings in BF3. I still cannot decide between 7970's or 680's.


----------



## CallsignVega

Quote:


> Originally Posted by *Hydrored*
> 
> Thanks. I'm in the process of my new build now and I have everything but the GPUs. I currently have 3 24" screens. I want to get rid of them and get a single large monitor, but I figured I would at least need SLI/CF to give me the horsepower for 120Hz and ultra settings in BF3. I still cannot decide between 7970's or 680's.


I run BF3 with everything on Ultra at 2048x1280 (besides MSAA, which I don't need) and can stay above my refresh rate of 90 Hz 99% of the time. If you get a 1440P or 1600P screen and like to run 4X MSAA due to it being an LCD, I would recommend two cards. 7970's are great cards and have really played catch-up in Eyefinity, but I think the 680 is definitely the better card for single screen / multi-GPU: faster, uses less power, plus SLI smoothness. If you need a couple of water-cooled 680's, let me know.


----------



## Hydrored

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hydrored*
> 
> Thanks. I'm in the process of my new build now and I have everything but the GPUs. I currently have 3 24" screens. I want to get rid of them and get a single large monitor, but I figured I would at least need SLI/CF to give me the horsepower for 120Hz and ultra settings in BF3. I still cannot decide between 7970's or 680's.
> 
> 
> 
> I run BF3 with everything on Ultra at 2048x1280 (besides MSAA which I don't need) and can stay above my refresh rate of 90 Hz 99% of the time. If you get a 1440P or 1600P screen and like to run 4X MSAA due to being an LCD I would recommend two cards. 7970's are great cards and really play catch-up in Eyefinity, but I think the 680 is definitely a better card for single screen / multi-GPU. Faster and uses less power + SLI smoothness. If you need a couple of water cooled 680's let me know.
Click to expand...

More than likely it will be 7970's because of their compute power. This machine will also be BOINCing. I have a 2P/16-core for 24/7 BOINC, but I would also like the 3930K cranking points when not gaming. I really want 680s but I think I'm stuck going AMD.


----------



## iCrap

Wow huge turnaround. I don't think i could give up my 3 screens. 0_o


----------



## dmasteR

Quote:


> Originally Posted by *CallsignVega*
> 
> I would say the tired eyes wasn't because of the CRT, it was the combination of sitting in front of a massive magnified image. Sitting about 2 feet from a 52" screen is great for immersion though lol.
> You know I already contemplated 3x IPS 120Hz 2B versions, but I did some research and the pixel response time is way too slow for me (the higher Hz cannot change that). If the 2ms 120 Hz TN panels were too slow, those would be extremely slow for me. Sorry, no more stops on the LCD train for me. It is CRT and then OLED. *Like I said, Surround/Eyefinity is great for casual gameplay, but for more competitive gaming I think it slows you down*.


We argued about this back in the BF3 thread way back when the game came out, and you would always tell me the opposite: that the three displays gave you an advantage. Surprised you have changed your mind.

When are you putting the FW900 up for sale in the market? I have a buddy who may be interested in one if the price is right!


----------



## CallsignVega

Quote:


> Originally Posted by *dmasteR*
> 
> We argued about this back in the BF3 thread way back when the game came out, and you would always tell me the opposite: that the three displays gave you an advantage. Surprised you have changed your mind.
> 
> When are you putting the FW900 up for sale in the market? I have a buddy who may be interested in one if the price is right!


Portrait Surround gives you an advantage in some areas, like making it easy to snipe since distant objects are larger. Although in a lot of areas like quick close-quarter combat it is just too big to absorb all of the information on the screen.

The calibrated A+ FW900 that is like new that I would ship isn't that cheap, and the shipping would be around $190. But I am willing to entertain offers. The B+ Accurate IT one is much cheaper and available for local pickup only, unless someone wants to spring for the shipping.

On another note:

While I was testing all of my GTX 680's to find which was the best overclocker, I found out I have a beast card. I always wondered why this card, all the way at the end in my 4-way SLI config, operated so much cooler than the others. It is crazy efficient.

1328 MHz core, 3580 MHz memory at stock voltage! Also EVGA Scanner X and OCCT stable at these frequencies. My other three GTX 680's would top out around 1240-1260 MHz core when used on their own.

This core is so efficient it idles at 18 C! Anyone else see clocks like these? It will make a nice single GPU for my new mini-build.

http://www.youtube.com/watch?v=th4pN8hBlQk&feature=youtu.be


----------



## dmasteR

Quote:


> Originally Posted by *CallsignVega*
> 
> Portrait Surround gives you an advantage in some areas, like making it easy to snipe since distant objects are larger. Although in a lot of areas like quick close-quarter combat it is just too big to absorb all of the information on the screen.
> The calibrated A+ FW900 that is like new that I would ship isn't that cheap, and the shipping would be around $190. But I am willing to entertain offers. The B+ Accurate IT one is much cheaper and available for local pickup only, unless someone wants to spring for the shipping.
> On another note:
> While I was testing all of my GTX 680's to find which was the best overclocker, I found out I have a beast card. I always wondered why this card, all the way at the end in my 4-way SLI config, operated so much cooler than the others. It is crazy efficient.
> 1328 MHz core, 3580 MHz memory at stock voltage! Also EVGA Scanner X and OCCT stable at these frequencies. My other three GTX 680's would top out around 1240-1260 MHz core when used on their own.
> This core is so efficient it idles at 18 C! Anyone else see clocks like these? It will make a nice single GPU for my new mini-build.
> http://www.youtube.com/watch?v=th4pN8hBlQk&feature=youtu.be


Whenever you decide to put it up in the market, do post in this thread. As I'm sure a few of us may be interested.


----------



## csm725

Quote:


> Originally Posted by *marbleduck*
> 
> Wat? Sony FW900s for sale, you say?
> Me gustaría un FW900.


I'd also like an FW900 but I don't have the money! D:


----------



## thrasherht

Hey Vega, the build is amazing, but partway through viewing the thread your Photobucket went over its bandwidth limit, so none of the pictures work anymore.


----------



## armartins

Waiting for the "Vega's still Heavyweight Display, lightweight computer; edition 2012" update =)


----------



## CallsignVega

Stupid Photobucket. I love how, instead of resetting your bandwidth limit back to the free-plan limit of 10GB when you stop paying, it locks you in as having already used all your bandwidth. Just a cheap way to get you to keep paying. I need to find a better service. Is ImageShack any good/truly free?


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> Stupid photobucket. I love how instead of resetting your bandwidth limit back to the free model limit of 10GB when you stop paying, it locks you as having already used all your bandwidth. Just a cheap way to get you to keep paying. I need to find a better service. Is imageshack any good/truly free?


Facebook and imgur are decent too.


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> Stupid photobucket. I love how instead of resetting your bandwidth limit back to the free model limit of 10GB when you stop paying, it locks you as having already used all your bandwidth. Just a cheap way to get you to keep paying. I need to find a better service. Is imageshack any good/truly free?


Imgur is great, but only free for the first ~220-ish photos.


----------



## nvidiaftw12

Facebook. :lachen:


----------



## Samurai Batgirl

Have you tried Picasa?


----------



## G33K

Imgur/Minus. Not sure if there even is a limit for Imgur, but Minus has 10 gigs and preserves your EXIF data (for photographers).


----------



## PoopaScoopa

Quote:


> Originally Posted by *stren*
> 
> Imgur is great, but only free for the first ~220 ish photos.


Can't you just create another account? CallSignVega2/etc. Imgur is def the best image host out there in my experience.


----------



## nvidiaftw12

Need too many emails.


----------



## PoopaScoopa

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Need too many emails.


If you use gmail, you can always do the [email protected] and have it still go to [email protected] Easy to create filters for incoming mail directed to the +whatever as well.
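
The plus-addressing trick above can be sketched in a few lines. This is just an illustration: `gmail_canonical` is a hypothetical helper name, and the behavior it models (Gmail ignoring a `+suffix` and dots in the local part) is Gmail's documented address handling.

```python
# Sketch: normalize a Gmail plus-address back to the base inbox.
# Gmail delivers "user+anything@gmail.com" to "user@gmail.com" and
# also ignores dots in the local part.
def gmail_canonical(address: str) -> str:
    local, domain = address.rsplit("@", 1)
    local = local.split("+", 1)[0]          # drop the +suffix
    if domain.lower() in ("gmail.com", "googlemail.com"):
        local = local.replace(".", "")      # Gmail ignores dots too
    return f"{local}@{domain.lower()}"

print(gmail_canonical("call.sign.vega+imgur2@gmail.com"))
```

So one inbox can hand out a distinct `+whatever` address per image-host account, with filters keyed on the suffix.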


----------



## nvidiaftw12

Interesting.


----------



## rdrdrdrd

I shed tears for the awesome setup that was only around for a short time... It seems you have the money, so... why not both? lol

Just add a fourth FW900 on a separate desk with the 4th 680, and disable SLI when you want to play competitively. A second keyboard and mouse wouldn't be a problem either.


----------



## citizenjon

*I'd give anything to see upcoming Aliens game on this baby.*

I stumbled onto this thread researching my first build - that was like 8 hours ago. At first I thought the OP's project was taking the piss - geothermal heat diffusion?! Fresnel lens display?! But then I watched the YouTube vids, and I have to say, from a 100% noob's POV this is mind-blowing stuff.
This build is f**king LEGEND!
Thanks for all the work that went into this highly entertaining/educational thread.

Vega have you ever thought about trying out nVidia 3D Vision 2 glasses with this monster?

Kudos and subbed.


----------



## CallsignVega

Quote:


> Originally Posted by *rdrdrdrd*
> 
> i shed tears for the awesome setup that only was for a short time.... It seems you have the money; so.... Why not both? lol
> 
> just add a fourth FW900 on a separate desk with the 4th 680 and disable SLI when you want to play competitively, second keyboard and mouse wouldn't be a problem either.


Hmm, now you got me wondering. Just going through my head what would be required to switch back and forth and how much of a pain that would be. But that would be the best of both worlds. Think upon this hard I will.

Quote:


> Originally Posted by *citizenjon*
> 
> 
> *I'd give anything to see upcoming Aliens game on this baby.*
> I stumbled onto this thread researching my first build, that was like 8 hours ago, at first I thought OP's project was taking the piss - geothermal heat diffusion?! Fresnel lens display?!! But then I watched Youtube vids, and I have to say, from a 100% noob's pov this is mind blowing stuff.
> This build is f**king LEGEND!
> Thanks for all the work that went into this highly entertaining/educational thread.
> Vega have you ever thought about trying out nVidia 3D Vision 2 glasses with this monster?
> Kudos and subbed.


Ya, that Aliens game looks kinda cool. Aliens has always been one of my favorite franchises, except developers always seem to manage to screw things up. I tried the NVIDIA 3D Vision 2 glasses and didn't care for them too much. Too much flicker for me.


----------



## armartins

+1 to the dual-PC idea! I'd also find a way to use a single WC loop or something like that for both machines at once... it would be legend - wait for it - ary!


----------



## citizenjon

True enough, many of the recent Aliens incarnations have been disappointing, but the developers for this are total Aliens fanboys, plus it's been in the pipeline for 4 years - it's got to measure up. That's why I decided to try and build my first rig; console gaming is graphically very limiting.

Hey would you give me an opinion on the parts I've selected for my first build?
My finances are limited so no sli for me.

PSU: Corsair CMPSU-750TXV2UK 
MOBO: Gigabyte SKT-2011 X79-UD3 (Rev 1.0)
Case: Corsair Obsidian Series 650D
CPU: i5-2500K (3.30GHz, 6MB Cache, Socket 1155)
Monitor: Asus VG278H (I'm dreaming here - can't actually afford this)
GPU: ?? Something good but cheap (I'll upgrade someday)

I've already got a Vertex 2 60GB and a Kingston 128GB v, but I've no idea about cooling a CPU or the RAM I'll need.
I've never OCed anything in my life but I'd like to start.
Any advice from anybody much appreciated.


----------



## Eggy88

Quote:


> Originally Posted by *armartins*
> 
> it would be legen - wait for it - *d*ary!


Sry just had to fix this.


----------



## rdrdrdrd

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, now you got me wondering. Just going through my head what would be required to switch back and forth and how much of a pain that would be. But that would be the best of both worlds. Think upon this hard I will.


Well if you make a display profile (or is that just AMD?) for Surround, you can disable it for single-monitor play and have the fourth monitor be the 'primary' monitor so games maximize to it.
Put it on an adjacent desk with an identical mouse and keyboard and just leave them plugged in all the time. The HD650's have a ten-foot cord too.

Quote:


> Originally Posted by *citizenjon*
> 
> 
> *I'd give anything to see upcoming Aliens game on this baby.*


gosh.... that alien mouth looks exactly like a vagina... wth?


----------



## nvidiaftw12

:lachen: ^


----------



## Smo

Quote:


> Originally Posted by *rdrdrdrd*
> 
> gosh.... that alien mouth looks exactly like a vagina... wth?


Maybe so you don't try to fight it


----------



## CallsignVega

I am toying around with the idea of two different systems, more or less to have the best of both worlds. The single FW900/GPU for awesome FPS, where it's best to have everything in central vision with instant response and no lag, and a second system where lag/motion blur isn't that big of a deal for games like Guild Wars 2.

My desk is big enough to actually support both setups lol. Been looking into those 27" 1440P IPS Catleap monitors that you can overclock to 100-120Hz. If I wanted to go real big, I could get five of those in portrait and de-bezel them. That would limit me to AMD for that particular system, and the 7990 to boot. Stupidly, the 6GB cards that still haven't appeared wouldn't even have the necessary outputs for a five-monitor setup. The 7990 releasing in July will be the only card to have the ports to do it properly (4x DP and 1x dual-link DVI).

The good thing about the IPS screens is that with the 5x1 setup I wouldn't run into the viewing-angle problems I experienced on my previous 120 Hz portrait TN setup. The issue I will run into is that I would need 4x DP->DVI adapters, which run at a max of 330 MHz. So that means the monitors would run at 89 Hz max. Still, that is quite a nice bump up from 60 Hz, and around 90 Hz the pixel response time on those non-overdrive Catleaps matches up with the refresh rate. Another large issue with AMD is how Eyefinity will handle those custom refresh rates. Has anyone ever seen at least three Catleaps run in Eyefinity at the higher refresh rate?
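
That refresh ceiling can be sanity-checked with rough timing math, as a sketch: max refresh is the adapter's pixel clock divided by the total pixels per frame including blanking. The blanking totals below are assumptions for illustration (standard CVT reduced-blanking totals for 2560x1440, plus an assumed tighter custom timing of the kind Catleap overclockers used, which is what pushes the ceiling toward ~89 Hz).

```python
# Sketch: max refresh a 330 MHz DP->DVI adapter can drive at 2560x1440.
def max_refresh_hz(pixel_clock_hz, h_total, v_total):
    # refresh = pixel clock / (total pixels per frame, blanking included)
    return pixel_clock_hz / (h_total * v_total)

cvt_rb = max_refresh_hz(330e6, 2720, 1481)  # assumed CVT reduced-blanking totals
tight  = max_refresh_hz(330e6, 2576, 1448)  # assumed very tight custom timings
print(round(cvt_rb, 1), round(tight, 1))
```

With standard reduced blanking the cap lands near 82 Hz; only with aggressively trimmed blanking does it approach the quoted 89 Hz.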

So the setup would be 7200x2560 resolution, or a gargantuan 18.4 million pixels! Another issue: would the 3GB VRAM on the 7990 be a limit? How far would I need to turn settings down? I think I would be alright in most games; Guild Wars 2 didn't use much VRAM at all in my testing. With my Rampage IV Extreme and two water-cooled 7990's, each would have its own 16x PCI-E 3.0 slot to swap those massive frames in Quad-Fire.
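
Back-of-the-envelope numbers for that array, as a sketch; the 32-bit color depth and buffer counting are assumptions, and real VRAM pressure comes mostly from textures and render targets rather than the scan-out buffers themselves.

```python
# Sketch: pixel count and raw framebuffer size of a 5x1 portrait 1440p array.
w, h, screens = 1440, 2560, 5          # each panel rotated to portrait
pixels = w * h * screens               # total pixels across the array
bytes_per_px = 4                       # assumed 32-bit color
frame_mb = pixels * bytes_per_px / 2**20
print(pixels, round(frame_mb, 1))
```

One 18.4-megapixel frame is only about 70 MB, so the 3GB question is really about game assets at that resolution, not the display surface.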

Thoughts?


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> I am toying around with the idea of two different systems more or less to have the best of both worlds. The single FW900/GPU for awesome FPS in which it's best to have everything central vision with instant response and no lag, and a second system where lag/motion blur isn't that big of a deal for games like Guild Wars 2.
> My desk is big enough to actually support both setups lol. Been looking into those 27" 1440P IPS catleap monitors that you can overclock to 100-120Hz. If I wanted to go real big, I could get five of those for portrait and de-bezel them. That would limit me to AMD though for that particular system, and the 7990 to boot. Stupidly the 6GB cards that still haven't appeared wouldn't even have the necessary outputs for a five monitor setup. The 7990 releasing in July will be the only card to have the ports to do it properly. (4x DP and 1x Dual Link DVI).
> The good thing about the IPS screens is with the 5x1 setup I wouldn't run into the viewing angle problems that I had experienced on my previous 120 Hz portrait TN setup. The issue I will run into is that I would need 4x DP->DVI adapters that run at a max of 330 MHz. So that means the monitor's would run at 89 Hz max. Still, that is quite a nice bump up from 60 Hz. Around 90 Hz the pixel response time on those non-overdrive catleap's matches up with that refresh rate. Another large issue with AMD is how will Eyefinity deal/handle those custom refresh rates? Has anyone ever seen anyone run at least three catleap's in Eyefinity at the higher refresh rate?
> So the setup would be 7200x2560 resolution, or a gargantuan 18.4 million pixels! Another issue would be; would the 3GB VRAM on the 7990 be a limit? How far would I need to turn settings down? I think I would be alright in most games. Guild Wars 2 didn't use much VRAM at all in my testing. With my Rampage IV Extreme and two water cooled 7990's, they would have their own 16x PCI-E 3.0 slot to swap those massive frames in Quad-Fire.
> Thoughts?


You could go with 4 6gb 7970s.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> You could go with 4 6gb 7970s.


Take a look at my post again.

Really stupid output config IMO. Even in the article you can read them criticizing the lack of proper outputs for large Eyefinity setups (which is the sole reason anyone would ever bother with a 6GB GPU):

http://translate.google.com/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Foclab.ru%2Ftopic%2Fvozdushnyiy-chempion-ili-obzor-sapphire-hd-7970-toxic


----------



## nvidiaftw12

Can't you plug in displays to multiple cards and have it work? Ah, de-zant is here, he will tell.


----------



## De-Zant

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Can't you plug in displays to multiple cards and have it work? Ah, de-zant is here, he will tell.


All the displays in a single display group must be connected to a single card. This is why the connectors on the 6GB card above are such an issue here.


----------



## burksdb

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Can't you plug in displays to multiple cards and have it work? Ah, de-zant is here, he will tell.


Not with ATI; they still run everything through one card, meaning all data goes through the PCI-E bus and CrossFire bridge.

Nvidia, I believe, just addressed that problem with the 680's; with NV Surround you can hook into multiple cards.


----------



## rdrdrdrd

Quote:


> Originally Posted by *CallsignVega*
> 
> I am toying around with the idea of two different systems more or less to have the best of both worlds. The single FW900/GPU for awesome FPS in which it's best to have everything central vision with instant response and no lag, and a second system where lag/motion blur isn't that big of a deal for games like Guild Wars 2.
> My desk is big enough to actually support both setups lol. Been looking into those 27" 1440P IPS catleap monitors that you can overclock to 100-120Hz. If I wanted to go real big, I could get five of those for portrait and de-bezel them. That would limit me to AMD though for that particular system, and the 7990 to boot. Stupidly the 6GB cards that still haven't appeared wouldn't even have the necessary outputs for a five monitor setup. The 7990 releasing in July will be the only card to have the ports to do it properly. (4x DP and 1x Dual Link DVI).
> The good thing about the IPS screens is with the 5x1 setup I wouldn't run into the viewing angle problems that I had experienced on my previous 120 Hz portrait TN setup. The issue I will run into is that I would need 4x DP->DVI adapters that run at a max of 330 MHz. So that means the monitor's would run at 89 Hz max. Still, that is quite a nice bump up from 60 Hz. Around 90 Hz the pixel response time on those non-overdrive catleap's matches up with that refresh rate. Another large issue with AMD is how will Eyefinity deal/handle those custom refresh rates? Has anyone ever seen anyone run at least three catleap's in Eyefinity at the higher refresh rate?
> So the setup would be 7200x2560 resolution, or a gargantuan 18.4 million pixels! Another issue would be; would the 3GB VRAM on the 7990 be a limit? How far would I need to turn settings down? I think I would be alright in most games. Guild Wars 2 didn't use much VRAM at all in my testing. With my Rampage IV Extreme and two water cooled 7990's, they would have their own 16x PCI-E 3.0 slot to swap those massive frames in Quad-Fire.
> Thoughts?


I think at this point you have just become obsessed with the build and not the result.

IMHO you have the best possible setup right now with the FW900s. If you really want a single-monitor CRT setup, get another one and take the fourth (or fifth, for crying out loud) 680 from your current setup and throw it on another board with Ivy and fast RAM. Maybe you should build a case, leave the monitors alone for a bit, and get into modding; you'd end up spending a whole ton less money to satisfy your desire to build something epic lol.


----------



## CallsignVega

lol ya I spend more time building stuff to play games than actually playing the games.


----------



## XPC

Quote:


> Originally Posted by *burksdb*
> 
> not with ati they still run everything thru one card meaning all data goes thru the pci bus and crossfire bridge.
> nvidia i believe just confronted that problem with the 680's and with nvsurround you can hook into multiple cards.


Actually, you've been able to connect monitors to multiple GPUs from the start with NV Surround. In fact, that was the only way it was possible until the 680, since none of Nvidia's cards before it were capable of driving more than 2 displays simultaneously.


----------



## Jmatt110

Why not wait and see if one of the AIBs makes a 12GB 7990?


----------



## rindoze

I can't see a single one of your pictures. It says that you're too popular and have exceeded your bandwidth on all of them :O My eyes are curious as to what lies behind these banners!


----------



## armartins

Quote:


> Originally Posted by *Eggy88*
> 
> Sry just had to fix this.


Actually, thanks man! By your avatar I can see you take the expression seriously lol. I was considering where to put the "d" and in the end forgot it =).

@Vega: It's not from "these days," but it could be useful info. Two years ago, on my reference 5970 with 3x Dell 2209WA (1050P, e-IPS) using the Apple mini-DP to dual-link DVI adapter, I was able to run them @75Hz with hacked drivers, and it was noticeably better than @60Hz. I could also prove it was working with those refresh-rate white/black tests. Since we are two generations past that setup, I guess you would be OK at least in the "custom refresh rate with Eyefinity" department.


----------



## CallsignVega

Quote:


> Originally Posted by *armartins*
> 
> Actually, thanks man! By your avatar I can see you take the expression seriously lol. I was considering where to put the "d" and in the end forgot it =).
> 
> @Vega: It's not from "these days," but it could be useful info. Two years ago, on my reference 5970 with 3x Dell 2209WA (1050P, e-IPS) using the Apple mini-DP to dual-link DVI adapter, I was able to run them @75Hz with hacked drivers, and it was noticeably better than @60Hz. I could also prove it was working with those refresh-rate white/black tests. Since we are two generations past that setup, I guess you would be OK at least in the "custom refresh rate with Eyefinity" department.


Ah OK, good to know.


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> lol ya I spend more time building stuff to play games than actually playing the games.


Haha, so true, I do this as well. I've played about 2 hours' worth of games in 2012, but spent hours and hours on the latest rig lol.


----------



## rindoze

damnit get your pictures working already


----------



## CallsignVega

I did some testing on the Maximus V GENE motherboard with a highly overclocked GTX 680 (1320 MHz core): in its own dedicated 16x 3.0 slot it was 5% faster than when the slot was bumped down to 8x PCI-E 3.0. So even with a single GPU, the differences between 16x 2.0 (8x 3.0) and 16x 3.0 can start to surface. Granted, 5% isn't huge, but it isn't completely insignificant either, and it is reproducible, as many runs were tested.

I've put a moratorium on selling my 3960X/RIVE setup as I might need those dual native 16x 3.0 slots. I have three overclockable 1440P 27" Catleaps inbound from various sources to test. Going to de-bezel one to see how small it can get. If all goes well, I might get two more and two AMD 7990's when they launch in June for 5x1 portrait to sit next to my FW900. Two 7990's talking to each other and swapping 18.4-million-pixel frames is going to need some serious PCI-E bandwidth. Hopefully native 16x/16x 3.0 on X79 will fit the bill; I'm not sure 8x/8x native on the M5G would, as even with a single GPU/single monitor performance has already dropped.
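
For reference, the theoretical numbers behind that bandwidth worry can be sketched as follows. The per-lane rates follow the PCI-E 2.0/3.0 line encodings (8b/10b and 128b/130b); the frame-traffic figure assumes whole 32-bit frames get moved over the bus, which is a deliberate simplification of what CrossFire actually transfers.

```python
# Sketch: theoretical PCI-E bandwidth vs. the traffic of 18.4 Mpx frames.
def pcie_gb_per_s(gen, lanes):
    # payload bit rate per lane: gen2 = 5 GT/s * 8b/10b,
    # gen3 = 8 GT/s * 128b/130b; divide by 8 for bytes
    per_lane = {2: 5e9 * 8 / 10, 3: 8e9 * 128 / 130}[gen] / 8 / 1e9
    return per_lane * lanes

x16_gen3 = pcie_gb_per_s(3, 16)       # ~15.8 GB/s
x8_gen3  = pcie_gb_per_s(3, 8)        # ~7.9 GB/s
frame_gb = 18_432_000 * 4 / 1e9       # one assumed 32-bit frame, ~0.074 GB
print(round(x16_gen3, 1), round(x8_gen3, 1), round(frame_gb * 90, 1))
```

Under these assumptions, 90 FPS of raw frames alone is roughly 6.6 GB/s, a large share of an 8x 3.0 link but comfortable on 16x 3.0, which is consistent with wanting the dual native 16x slots.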


----------



## rdrdrdrd

I wondered how you could have so many awesome setups and not be satisfied, and then it hit me: you're a displayphile. You should create eye-fi.com or something lol. That 5% difference is important; it's the difference between a 570 and a 580.


----------



## nvidiaftw12

This build will never be finalized.


----------



## Nocturin

Quote:


> Originally Posted by *nvidiaftw12*
> 
> This build will never be finalized.


what build? Vega's system(s) are living, evolving organisms.


----------



## stren

Quote:


> Originally Posted by *Nocturin*
> 
> what build? Vega's system(s) are living, evolving *orgasms*.


fixed that for you


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> I did some testing on the Maximus V GENE motherboard with a highly overclocked GTX 680 (1320 MHz core), having it's own dedicated 16x 3.0 slot it was 5% faster than when the slot was bumped down to 8x PCI-E 3.0. So even with a single GPU the differences between 16x 2.0 (8x 3.0) and 16x 3.0 can start to surface. Granted 5% isn't huge but it isn't completely insignificant either and is reproducible as many runs were tested.
> I've put a moratorium on selling my 3960X/RIVE setup as I might need those dual native 16x 3.0 slots. I have three over-clockable 1440P 27" Catleaps inbound from various sources to test. Going to de-bezel one to see how small it can get. If all goes well, might get two more and get two AMD 7990's when they launch in June for 5x1 portrait to sit next to my FW900. Two 7990's talking to each other and swapping 18.4 million pixel frames is going to need some serious PCI-E bandwidth. Hopefully 16x/16x 3.0 X79 native will fit the bill, not sure if 8x/8x native on the M5G would as even with a single GPU/single monitor performance has already dropped.


Finally - I've been waiting for someone to debezel three+ and run them in portrait!

So no plans to use the PLX chip boards?


----------



## Blizlake

Quote:


> Originally Posted by *stren*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nocturin*
> 
> what build? Vega's system(s) are living, evolving *orgasms*.
> 
> 
> 
> fixed that for you
Click to expand...

LOL.

I'll have to hint to my friend about this; he's been searching for pics of debezeled Catleaps.


----------



## CrazzyRussian

Looking forward to the debezeling of the IPS monitors to see how small you can get the bezels. If your results are good I'll buy another pair of monitors; I usually game on one screen or Eyefinity for the less intensive games


----------



## CallsignVega

Quote:


> Originally Posted by *stren*
> 
> Finally - I've been waiting for someone to debezel three+ and run them in portrait!
> So no plans to use the PLX chip boards?


I haven't ruled it out for the Z77 boards, or even the X79 boards! I've already found that a native 8x PCI-E 3.0 slot slightly limits performance on a single GPU/monitor versus 16x 3.0, so I can only imagine running something like two 7990's pushing 18.4 million pixels in a pair of 8x 3.0 slots. I could go PLX 16x/16x on Z77 or native 16x/16x 3.0 on X79. Not sure if there would be much of a difference there, but I bet there would be a large difference versus 8x/8x native Z77.
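A quick back-of-envelope sketch of why those slot widths matter at this pixel count. The per-lane figures are nominal one-direction rates after encoding overhead, not measured numbers, and the frame-traffic estimate assumes whole frames cross the bus:

```python
GB = 1e9

# Nominal usable bandwidth per lane, one direction (GB/s), after
# 8b/10b (gen 2) and 128b/130b (gen 3) encoding overhead.
PCIE2_PER_LANE = 0.5
PCIE3_PER_LANE = 0.985

def slot_bandwidth(per_lane_gbs, lanes):
    """Approximate one-way slot bandwidth in GB/s."""
    return per_lane_gbs * lanes

# Frame traffic for the proposed 5x1 portrait 1440p wall
pixels = 5 * 1440 * 2560           # 18,432,000 pixels (~18.4 MP)
bytes_per_frame = pixels * 4       # 32-bit color
fps = 60
frame_traffic = bytes_per_frame * fps / GB  # GB/s of raw frame data

print(f"x8  3.0 slot: {slot_bandwidth(PCIE3_PER_LANE, 8):.1f} GB/s")
print(f"x16 3.0 slot: {slot_bandwidth(PCIE3_PER_LANE, 16):.1f} GB/s")
print(f"18.4 MP at {fps} fps: {frame_traffic:.2f} GB/s of raw frame data")
```

Dual-GPU cards swapping alternate frames over the bus eat a meaningful slice of an x8 slot's ~7.9 GB/s, which is consistent with the 16x vs 8x difference Vega measured.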

Quote:


> Originally Posted by *rdrdrdrd*
> 
> i wondered how you could have so many awesome setups and not be satisfied, and then it hit me, you're a displayphile
> 
> 
> 
> 
> 
> 
> 
> , you should create eye-fi.com or something lol. that 5% difference is important, it's the difference between a 570 and a 580
> 
> 
> 
> 
> 
> 
> 
> .


Haha I don't know if I am a displayphile or just touched in the head.









I ordered one of these bad boys that will be able to run 5x1 portrait Catleaps or 3x1 landscape Catleaps de-bezeled:


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> I haven't ruled it out for the Z77 boards, or even the X79 boards! I've already found 8x PCI-E 3.0 native slot slightly limiting performance on a single GPU/monitor versus 16x 3.0, so I could only imagine running something like two 7990's running 18.4 million pixels in two times 8x 3.0 slots. I could go PLX 16x/16x on Z77 or native 16x/16x 3.0 on X79. Not sure if there would be much of a difference there but I bet there would be a large difference versus 8x/8x native Z77.
> Haha I don't know if I am a displayphile or just touched in the head.
> 
> 
> 
> 
> 
> 
> 
> 
> I ordered one of these bad boys that will be able to run 5x1 portrait catleap's or 3x1 landscape catleap's de-bezeled:


Vega's such a baller he bought ice cube!


----------



## TxWolf

Quick question, because I've been watching this thread for a few months.

I've got an older gaming rig x58 mobo with i7-950.

CPU: Intel Core i7 950 @ 3.07GHz
RAM: 6.00 GB Triple-Channel DDR3
MOBO: MSI Big Bang-XPower (MS-7666)
Display: ASUS VG278H (1920x1080 @ 120Hz)
GPU: Gigabyte GV N580UD-3GI Graphics card - 3 GB - GDDR5 SDRAM

Is the general input from this group to purchase another GTX 580, such as an EVGA 580 3GB, and use "The Mod" (using an Antec 620) for inexpensive, effective water cooling in SLI?

Or consider purchasing a 690?

The reason why I ask is that I wanted to purchase the GK110 series, but that won't be out until end of 2012, and I want to fully utilize my 120Hz ASUS VG278H display at full settings.

Cost for 580 will be about $500.
Cost for 690 will be $1000 minus $300-$400 for selling my current Gigabyte 580 3gb.


----------



## stren

Quote:


> Originally Posted by *TxWolf*
> 
> Quick question, because I've been watching this thread for a few months.
> I've got an older gaming rig x58 mobo with i7-950.
> CPU: Intel Core i7 950 @ 3.07GHz
> RAM: 6.00 GB Triple-Channel DDR3
> MOBO: MSI Big Bang-XPower (MS-7666)
> Display: ASUS VG278H (1920x1080 @ 120Hz)
> GPU: Gigabyte GV N580UD-3GI Graphics card - 3 GB - GDDR5 SDRAM
> Is the general input from this group to purchase another GTX 580 such as an EVGA 580 3GB and use "The Mod" (using an Antec 620) for inexpensive effective water cooling for SLI.
> Or consider purchasing a 690?
> The reason why I ask is that I wanted to purchase the GK110 series, but that won't be out until end of 2012, and I want to fully utilize my 120Hz ASUS VG278H display at full settings.
> Cost for 580 will be about $500.
> Cost for 690 will be $1000 minus $300-$400 for selling my current Gigabyte 580 3gb.


There's a guy selling Zotac 580 3GB cards for $280 each on EVGA. That's what I would do. However, check the benchmarks for your games at that res. The 690 would perform better of course, but if you find two 580's still not hitting enough fps you can always add a third for cheap again. At that point, though, you're effectively paying more than two 670's.


----------



## TxWolf

No, not a fan of Zotac or Palits.

Originally chose the Gigabyte because I couldn't get an MSI Lightning Xtreme and the ASUS MARS 2 was too big.

Have always been a fan of EVGA, have used their cards for the past 5 years, but wanted something cool and quiet.

Now that I've incorporated "THE MOD" into my system, I can easily take an EVGA 580 3GB and cool and quiet it down to reasonable levels.

Anyone else have any input?


----------



## CallsignVega

Some pics:

The panel is about 1/2 inch in thickness. Here is a pic comparing the size of the Catleap vs FW900:










Metal LCD panel retention bracket top corner:










Bottom corner:










Retention bracket removed, lit pixel at 2" mark to side just over 1/2 inch:










Top lit pixel line to the edge where the panel ribbon cables wrap over to connect to the PCB, just over 1/2 inch:










The bottom LED bar nemesis that sticks out almost 1 full inch:










Bottom right corner as installed in housing:










Testing 3x 2B Catleaps in portrait (notice how small my trackball, G13 and keyboard are):










Man I forgot how big 27-30" screens are in portrait! I may have to try and get two more just to see the OMG factor of five of these in portrait.







Obviously the bezel gaps would be much smaller once I gave it the Vega treatment! The left side of each panel would overlap the large LED bar of the panel next to it, so each screen would have a slight angle to it. Not a huge deal with IPS viewing angles. Once properly done, the bezel gaps would be around 0.55 of an inch.


----------



## iCrap

So now you are back to LCD? Will this be a repeat of that 5x LCD build except this time with overclocked IPS catleaps, and fresnel lenses?


----------



## Samurai Batgirl




----------



## CallsignVega

Quote:


> Originally Posted by *iCrap*
> 
> So now you are back to LCD? Will this be a repeat of that 5x LCD build except this time with overclocked IPS catleaps, and fresnel lenses?


It would be a 5x1 portrait setup next to a FW900 so I could switch between the two. No Fresnels.


----------



## iCrap

Quote:


> Originally Posted by *CallsignVega*
> 
> It would be a 5x1 portrait setup next to a FW900 so I could switch between the two. No Fresnel's.


Next to a single FW900? And would you run it all off a single machine, or go with two?


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> It would be a 5x1 portrait setup next to a FW900 so I could switch between the two. No Fresnel's.


So how's the initial impression compared to your old samsung 120hz 1080p setup? Extra resolution is helpful? IPS vs TN? How's the performance? I assume you only have a few 680s to test with now?


----------



## CallsignVega

Quote:


> Originally Posted by *iCrap*
> 
> Next to a single FW900? And would you run it all of a single machine, or go with two?


Yes next to a single FW900. Not sure yet if I will get creative with a KVM switch if running 2x 7990's or just run two machines.

Quote:


> Originally Posted by *stren*
> 
> So how's the initial impression compared to your old samsung 120hz 1080p setup? Extra resolution is helpful? IPS vs TN? How's the performance? I assume you only have a few 680s to test with now?


Both have pros and cons. The extra resolution and IPS viewing angles are real nice, although you need almost double the GPU power, as 1440P has almost double the pixel count of 1080P screens. The TN screens required quite an angle in the 5x1 setup, not so much in the 3x1. The IPS would definitely be better in the 5x1 as far as viewing angle goes. The largest limitation will be the DP>DVI adapters, which will only allow the Catleaps to run at 81 or 82 Hz. I have one of those inbound to test to see if they too can be overclocked LOL.

2x 680's isn't enough power to run three of these screens, really 3x is min and 4x optimal. 2x 7990's should be able to run five of them as long as you turn some settings down and no MSAA, possibly FXAA if those AMD cards can even run FXAA. I still have some of the Samsung S23A750D's so running five of them in portrait isn't out of the question. The only reason I went away from those is that the only thing that could run the setup was a 6990 which wasn't that good IMO. 7990's should be much better for the task.
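The "almost double the GPU power" point above checks out as simple pixel arithmetic, comparing one 1440P Catleap against one 1080P panel like the S23A750D:

```python
# 1440p vs 1080p pixel counts, per the "almost double" point above.
p1440 = 2560 * 1440   # 3,686,400 pixels per 1440P screen
p1080 = 1920 * 1080   # 2,073,600 pixels per 1080P screen
ratio = p1440 / p1080
print(f"{ratio:.2f}x the pixels per screen")  # ~1.78x
```

So each Catleap asks the GPUs for roughly 78% more fill than a 1080P panel, before any refresh-rate difference is factored in.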

Plus I have a Gigabyte Sniper 3 inbound which will allow 16x/16x 3.0 with the PLX chip, so two 7990's should have plenty of room to breathe.

Here is a short video of Skyrim testing the Catleaps in portrait Surround at 100Hz:

http://www.youtube.com/watch?v=pAz5yx5V9hY


----------



## iCrap

What refresh rate are you running the catleaps at? Also, have you ever thought of doing 6 monitors, (3x3) in horizontal?


----------



## CallsignVega

Quote:


> Originally Posted by *iCrap*
> 
> What refresh rate are you running the catleaps at? Also, have you ever thought of doing 6 monitors, (3x3) in horizontal?


In that video I was running all three at 100 Hz. I would never consider 3x2 (6); that would put a bezel right smack in the middle. The only other usable layout above 5x1 that has a clear center monitor is a 3x3 (9) setup. 3x1 is quite a strain on the GPU's; 5x1 will barely be doable with quad-fire 7990's. If you saw how huge this display setup is in your face, 5x1 will be OMG.


----------



## armartins

How many mm can you trim off by removing the metal LCD panel retention bracket? I'm considering doing it on my U2412Ms here...


----------



## CallsignVega

A good 3mm. Every mm counts when it comes to bezel minimization.


----------



## feteru

Damn, this is ridiculous. You should have your 3xFW900 Fresnel Lens setup next to a 5x1 Catleap. Any particular reason you got rid of the FW900's? Might have missed that post.


----------



## CallsignVega

I still have them. I have like 10 displays in my house, it's a sickness.


----------



## CallsignVega

This is one of my favorite motherboards ever. Doing some setup:


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> I still have them. I have like 10 displays in my house, it's a sickness.


I can take a few "off" of your hands







.

Looking forward to see these all naked n junk. Let it all hang out!

Can you show some pictures of the back? I wanna see how the PCB's and VESA mounting stuff is located.


----------



## ceteris

LOL what are you going to do with your Max V Gene?


----------



## CallsignVega

Quote:


> Originally Posted by *ceteris*
> 
> LOL what are you going to do with your Max V Gene?


Sell it.


----------



## ganganputput

Hi Vega, I just bought 3x Samsung SyncMaster SA850D monitors (7680x1440 total res) and am running them in landscape Surround from one GTX 680. I realise I have to upgrade my mobo and go multi-GPU to run these monitors; I'm currently on an i5-760 at 4.2 GHz and a P7P55D-E Pro motherboard. What's your recommendation for mobo, CPU and number of cards in SLI to run these monitors well? I realise you did much testing on PCI-E 2.0 and 3.0 with different speeds and SLI configs, and I'm trying to make sense of it all. Basically I want good fps in BF3, like around the 60-90 fps range, with no slowdown. Any setup that you could recommend?


----------



## derickwm

How you liking those Koolance ram blocks? Thinking about picking them up myself for the next build.


----------



## CallsignVega

Quote:


> Originally Posted by *ganganputput*
> 
> Hi Vega, I just bought 3x Samsung SyncMaster SA850D monitors (7680x1440 total res) and am running them in landscape Surround from one GTX 680. I realise I have to upgrade my mobo and go multi-GPU to run these monitors; I'm currently on an i5-760 at 4.2 GHz and a P7P55D-E Pro motherboard. What's your recommendation for mobo, CPU and number of cards in SLI to run these monitors well? I realise you did much testing on PCI-E 2.0 and 3.0 with different speeds and SLI configs, and I'm trying to make sense of it all. Basically I want good fps in BF3, like around the 60-90 fps range, with no slowdown. Any setup that you could recommend?


I've found that two overclocked GTX 680's were not enough to run my 3x 1440P Catleaps properly. I'd say 3x would be a safe bet. You really can't run any MSAA though and will need to turn down some settings, unless you have the 4GB 680's. If you are looking for a new system to run that setup, you will be hard pressed to do any better than a 3770K and Gigabyte Sniper 3 like I am running. Awesome board for the money; I don't know how Gigabyte pulled it off.

Quote:


> Originally Posted by *derickwm*
> 
> How you liking those Koolance ram blocks? Thinking about picking them up myself for the next build.


The Koolance RAM blocks have served me well for a couple of years' worth of builds now. They keep the RAM nice and cool and haven't leaked a single drop of liquid.







Although they can be a bit tough to get on the RAM sticks (I apply the thermal tape to the RAM sticks, then heat the RAM blocks up nice and hot with a heat gun and slide 'em on).


----------



## ganganputput

Quote:


> Originally Posted by *CallsignVega*
> 
> I've found that two overclocked GTX 680's was not enough to run my 3x 1440P Catleap's properly. I'd say 3x would be a safe bet. You really can't run any MSAA though and will need to turn down some settings, unless you have the 4GB 680's. If you are looking for a new system to run that setup, you will be hard pressed to get any better than a 3770K and Gigabyte Sniper 3 like I am running. Awesome board for the money, I don't know how Gigabyte pulled it off.


Hi Vega, thanks for your reply. I would like to know what you think of the i5-3570K vs i7-3770K. If I'm not wrong, games don't use HT much? Or is it negligible? I think the i5 is also slightly cheaper? Just wanna tap your experience handling multiple boards/chips/cards









Yes, I like the board too; it's nice and green and will match the custom fluorescent green internals of my current Asus Vento 3600 case.

Oh, btw, since you are switching to the Gigabyte Sniper 3, are there mobo and chipset blocks for that, or are you sticking to air cooling?

I've become a fan of your thread and have been reading every post









Edit: wow, there's this new ASRock Extreme11 board that gives x16/x16/x16/x16!! Oh my god, drool! Couple that with 4x 4GB 680s and I can run max settings in any game on my 3 monitors. ")


----------



## africanos23

Hey Vega, I just wanted to ask: a friend's doing a build with 3 VGA-only monitors. Can he run 2D Surround on tri-SLI 680s with DVI-VGA connectors?


----------



## CallsignVega

Quote:


> Originally Posted by *ganganputput*
> 
> Hi Vega, thanks for your reply. would like to know what you think of the i5-3570k vs i7-3770k. if im not wrong games dont use HT much? or negligble? think the i5 is also slightly cheaper? just wanna tap your experience handling multiple boards/chips/cards
> 
> 
> 
> 
> 
> 
> 
> 
> yes i like the board too, its nice and green will match the custom fluoroscent green colour internals of my current asus vento 3600 case.
> oh, btw, since u are switching to the gigabyte sniper 3, are there mobo blocks and chipset blocks for that or r you sticking to air cooling?
> ive become a fan of your thread and have been reading every post
> 
> 
> 
> 
> 
> 
> 
> 
> edit: wow theres this new asrock extreme11 board that gives x16x16x16x16!! oh my good drool! couple that with 4x 4gb 680s and i can run max settings on any game and stuff on my 3 monitors. ")


3570K would be fine. I disable HT on my 3770K anyway; I bought the 3770K just for the slightly larger L3 cache, as it was only a few dollars more. I haven't seen a water block yet, but I heard a rumor that one might come out. For now I just have a 200mm fan blowing down on the whole motherboard.

Ya, the Extreme 11 X79 will be an awesome board if and when it releases. That is the board I would go to if I go back to 4-way SLI in large Surround setups. 4x 4GB EVGA Classified 680's and the Extreme 11 would be about the best setup you could possibly have. Unfortunately NVIDIA Surround only allows a max of three displays, so I will be dabbling with two AMD 7990's and five overclocked Catleaps in portrait Eyefinity (or possibly re-doing my 5x1 Samsung 23" 120Hz setup if the overclockable Catleaps aren't produced anymore and/or five Catleaps are just too tough to run even for two watercooled and highly overclocked 7990's in quad CrossFire). The two 7990's will each have their own 16x 3.0 slot on the Sniper 3 so it should work out well. That PLX chip is going to get a workout pushing 18.4 million pixels.









Quote:


> Originally Posted by *africanos23*
> 
> Hey Vegas, I just wanted to ask because a friends doing a build with 3 vga only monitors can he run 2d surround on tri sli 680s with dvi-vga connectors ?


A GTX 680 can only handle one VGA monitor at a time. He would need three GTX 680's or two GTX 690's, so you should be good with either of those combos.


----------



## africanos23

Thanks!


----------



## CallsignVega

Great news about the Accell 330MHz DP to DVI adapter: http://www.accellcables.com/products/DisplayPort/DP/dp330_dvid.htm

They must have used a nice chip in here, as I can overclock this bad boy to ~425 MHz pixel clock, or about 108 Hz on the Catleap. It's good to know that I can build one of my monster Eyefinity setups now and be at a pretty good refresh rate.
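The pixel-clock-to-refresh-rate relationship behind those numbers is straightforward to sketch. This assumes typical CVT reduced-blanking timings for 2560x1440 (the blanking values are an assumption, not read from the Catleap's actual EDID); it lands right on the 81-82 Hz stock adapter limit mentioned earlier, and a bit under Vega's ~108 Hz, which suggests his timings are slightly tighter than stock CVT-RB:

```python
# Refresh-rate estimate from pixel clock, assuming typical CVT
# reduced-blanking timings for 2560x1440 (assumed, not from the EDID).
h_total = 2560 + 160   # active + CVT-RB horizontal blanking = 2720
v_total = 1440 + 41    # active + CVT-RB vertical blanking = 1481

def refresh_hz(pixel_clock_mhz):
    """Refresh rate = pixel clock / (total pixels per frame, incl. blanking)."""
    return pixel_clock_mhz * 1e6 / (h_total * v_total)

print(f"330 MHz (stock adapter limit): {refresh_hz(330):.1f} Hz")  # ~81.9 Hz
print(f"425 MHz (overclocked):         {refresh_hz(425):.1f} Hz")  # just over 105 Hz
```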









Now all I need is one more 2B Catleap monitor, three more of these adapters and 2x 7990's and my 18.4 million pixel de-bezeled 5x1 portrait Eyefinity build of ownage can commence.


----------



## Nocturin




----------



## TheBadBull

You never really limit yourself, do you?


----------



## kakee

Quote:


> Originally Posted by *TheBadBull*
> 
> You never really limit yourself, do you?


Shhh

Better for everyone, no limits


----------



## RussianHak

I just crapped myself.


----------



## TheBadBull

Quote:


> Originally Posted by *kakee*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheBadBull*
> 
> You never really limit yourself, do you?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shhh
> 
> Better for everyone, no limits


----------



## SI51

Do you have any detailed information on how you set up the Fresnel lens? And what it does exactly?

Thanks,
SI51


----------



## pahoran

Hey Vega, long-time reader, amazed by this extreme setup. Been reading since the beginning but lost track a week ago. Why did you go Ivy with the Maximus V Gene, instead of the Gigabyte Sniper X79 and/or RIVE X79?

I'm jumping from a Z68 i5-2500K to a GA-X79-UD3 and i7-3820, and would like your valued input. I will only be running one Shimian 27" monitor at 1440P, but way later on down the road I would like to run 4-way SLI or CrossFire. The GA-X79-UD3 has 4 PCI-E 3.0 slots and it's not extremely expensive. I really don't get the PLX chip thing between Z77 and X79, so I will do a bit more research on that.

I don't know if I should go Ivy or Sandy-E.

Thank You! and btw love your youtube vids!


----------



## CallsignVega

I am running the Sniper 3 now. It's like my 3rd major config change in three months.









If you are going 4-way SLI down the road (running high-resolution Surround), I'd maybe wait for the ASRock Extreme 11 (2x PLX chips for quad 16x PCI-E 3.0). That's what I'd get if I were going to do something like four Big Keplers and/or four 7970's with high-resolution Eyefinity/Surround. As a matter of fact, that's why I'm holding onto my 3960X, just in case something interesting comes out of Computex next week. Maybe some 7970's with 6x mini-DP that I could 4-way CrossFire it up with on the Extreme 11. If the 7990 turns out sweet, I'll get two of those, toss them in the two 16x slots on the Sniper 3, and call it a day. Well, until I realize that AMD has once again bungled their dual-GPU card drivers and I have to revert to something else.


----------



## zdude

lol, I would have thought that you would have found a driver hack that lets you run a 5-way Surround setup by now, Vega. You disappoint me


----------



## CallsignVega

Quote:


> Originally Posted by *zdude*
> 
> lol, i would have thought that you would have found a driver hack that lets you run a 5-way surround setup by now vega, you disapoint me


AMD and NVIDIA cannot get their own drivers to work properly in multi-display setups, let alone me.


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> AMD and NVIDIA cannot get their own drivers to work properly in mutli-display setups, let alone me.


You should try something like this. Not sure if it would still work, or if it was ever any good, but you may have enough displays.


----------



## De-Zant

Quote:


> Originally Posted by *feteru*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> AMD and NVIDIA cannot get their own drivers to work properly in mutli-display setups, let alone me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You should try something like this. Not sure if it would still work, or if it was ever any good, but you may have enough displays.

That's duniek. He's gone past just 9 displays. He's had 15 at one point.

http://www.youtube.com/watch?v=2ukOb4tiQQI&feature=channel&list=UL

He used a combination of eyefinity, softTH, and triplehead2go IIRC. The result is a setup that is impossible to run at decent frame rates, has bezels everywhere, can only use DX9, and requires more effort to upkeep properly.

And yes, it's still possible if one wants to do it.


----------



## CallsignVega

Quote:


> Originally Posted by *feteru*
> 
> You should try something like this. Not sure if it would still work, or if it was ever any good, but you may have enough displays.


No thanks, the gameplay experience would be quite poor.


----------



## feteru

Quote:


> Originally Posted by *De-Zant*
> 
> That's duniek. He's gone past just 9 displays. He's had 15 at one point.
> http://www.youtube.com/watch?v=2ukOb4tiQQI&feature=channel&list=UL
> He used a combination of eyefinity, softTH, and triplehead2go IIRC. The result is a setup that is impossible to run at decent frame rates, has bezels everywhere, can only use DX9, and requires more effort to upkeep properly.
> And yes, it's still possible if one wants to do it.


That is impressive, and the bezels and color correction seem to be the worst parts of that. I do still have to wonder how it would be with bezels removed, and good displays, mayhaps the gameplay experience would be better.


----------



## armartins

Any news, Vega? What about some pics or videos of the 3x portrait Catleaps debezeled? I know you already did it lol. I'm wondering how it would work on 2-3 680s while you can't do the 5x... well, you could with one Asus DCII and 3 other 7970s in your RIVE build... I guess the Lightning 7970 wouldn't be an option since both DVI ports are single-link only, and I guess there isn't an active single-link DVI to dual-link DVI adapter... One Asus DCII (has a waterblock available), 4 of those DP->DVI dual-link adapters and 3 other 7970s and you'd be fine... It would be nice to know if the CrossFire bridge would be a limiting factor in this extreme scenario... also if micro stuttering would be a problem..


----------



## CallsignVega

If I go 2x 7990's, the Sniper 3 MB will be good to go with 2x 16x 3.0 slots. That's if it has the same 4x mini-DP + 1x DL-DVI port layout as the 6990, or better yet 5-6 mini-DP.

Waiting to see what comes out of Computex. I talked to ASRock and the Extreme11 MB is a go. So if someone comes out with "Eyefinity 6" type 7970's, I'll get four of those to run 5x1 2B Catleaps, or maybe even 4x 680 Classifieds to run 3x1 2B Catleaps if AMD and their partners screw the pooch on proper outputs.

The Asus DCII 7970 is no good, when all 4 DP are used there is only 1 single link DVI left. Poor design.


----------



## armartins

I guess you mean there are only 2 single-link DVI, right? At least my 6970 DCII works like that: with the switch in the "split" position (I don't recall if 0 or 1 now), the DP works as well as the DVI output (I can only test single-link resolutions since my monitors are 1200P). If I change the position, the DVI still works (probably in dual link) but the DP isn't even recognized; it's disabled. I always thought the other DVI was dual link all the time, so you could hook up your 4 adaptors on the DPs and the fifth monitor to the non-shared DVI output. But you're probably right, since the ASUS site says only 1 dual-link DVI.


----------



## Flam3h

Hi Vega,

I was wondering if you could help me out.

Currently trying to build a rig that will hold at 100FPS on my Catleap 2B.

I was going to go for an X79 Rampage IV Extreme, 3930K and 3x 680's (2GB Signature SC's). I currently have a CM Cosmos II case to house everything.

I have already purchased the 3930K, but I am willing to sell it if I am better off with a Z77 Sniper 3 setup.

Questions.

Did you manage to get the X79 Rampage IV Extreme working in PCI-E 3.0 with the registry fix, even on the latest drivers, with no boot/game crashes? (I don't want to be limited to PCI-E 2.0 8x on my cards.)

Are there any other issues that made you change to Z77 with the Sniper 3? Will I be OK sticking with the X79 platform for my 3x 680s, or do you recommend I sell the 3930K, halt my purchases of the Rampage and 680's, and change?

I was looking at the 690's as I wouldn't have the issues with PCI-E 3.0. But for one, they are sold out (UK), and two, 2x 690s is a lot of cash and overkill for one Catleap 2B (where 1x 690 may not be enough), whereas I think 3x 670's / 680's would be perfect for the FPS I'm after at 1440P.

Any advice would be greatly appreciated.


----------



## thestache

Quote:


> Originally Posted by *Flam3h*
> 
> Hi Vega,
> I was wondering if you could help me out.
> Currently trying to build a rig that will hold at 100FPS on my Catleap 2B.
> I was going to go for a x79 Extreme IV Rampage, 3930k and 3x 680's (2GB Signature SC's). I currently have a CM Cosmos II case to house everything.
> I have already purchased the 3930k, but I am willing to sell it, if I am better off with z77 sniper 3 setup.
> Questions.
> Did you manage to get the x79 Rampage Extreme working in PCI 3.0 with the registry fix even on the latest drivers with no boot/game crashes? (I don't want to be limited to PCI 2.0 8x on my cards).
> Is there any other issues that made you change to z77 with the sniper 3?. Will I be ok keeping with the X79 platform for my 3x 680s or do you recommend I sell the 3930k, halt my purchases of the Rampage and 680's and change?.
> I was looking at the 690's as I wouldn't have the issues with PCI 3.0. But for one, they are sold out (UK) and two, 2x 690s is a lot of cash and overkill for 1 Catleap 2B (where 1x 690 maybe not enough) - where as I think 3x 670's / 680's would perfect for the FPS I'm after at 1440p.
> Any advise would be greatly appreciated.


You realize that with the way Sandy Bridge-E and Ivy Bridge work, if you were stuck at PCIe 2.0 16x/8x/8x on an X79 board it would be the same as running PCIe 3.0 8x/4x/4x on a Z77 board, yeah?

Ivy Bridge only has 16 PCIe lanes to work with, whilst Sandy Bridge-E has 40 PCIe lanes to work with.

I would do everything possible to keep the 3930K CPU (since it's much better than any Ivy Bridge chip) and either just enjoy it or try to get the Sandy Bridge-E system to work at PCIe 3.0 speeds. This of course is up to Nvidia's drivers at this point, since I haven't heard anything about them re-enabling PCIe 3.0 on Sandy Bridge-E systems since release. PLX chips on the Z77 chipset are nice and such but don't address the actual issue at hand. Seems Nvidia is buying into Intel's bull**** and not enabling it because X79 isn't officially PCIe 3.0, although it works perfectly fine at those speeds.

So consider doing the un-thinkable and choosing AMD if it's that important to you 'right now' because the HD 7970s seem to work fine at PCIe 3.0 speeds on X79 platforms.

Either downgrade to an Ivy Bridge platform so you can run native, supported, hassle-free PCIe 3.0 at 16x with a GTX 690 (on something like a Sabertooth board, because spending money on a Z77 platform with limited bandwidth is a waste of money; I know, I have an ASUS WS Revolution board and it does not solve the issue), or stick with Sandy Bridge-E (and upgrade to Ivy Bridge-E at the end of the year) and run 3x SLI on an X79 board, which essentially will give you the same performance as 3x SLI on a Z77 system. More if you can get it to work at the faster speeds.
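The lane math above can be sanity-checked with nominal per-lane throughput figures (a rough sketch; these are one-direction rates after encoding overhead, and real-world numbers vary):

```python
# Sanity-checking the PCIe 2.0 vs 3.0 equivalence claimed above.
# Nominal usable throughput per lane, one direction, in GB/s.
rate = {"2.0": 0.5, "3.0": 0.985}

def bw(gen, lanes):
    return rate[gen] * lanes

# X79 stuck at gen 2 vs Z77 splitting its 16 gen-3 lanes three ways
x79_gen2_only = [bw("2.0", n) for n in (16, 8, 8)]   # [8.0, 4.0, 4.0]
z77_gen3      = [bw("3.0", n) for n in (8, 4, 4)]    # ~[7.88, 3.94, 3.94]
print(x79_gen2_only)
print(z77_gen3)
```

Per slot, gen 2 x16 and gen 3 x8 come out nearly identical, which is the poster's point: X79 without the gen 3 registry fix is no worse than a lane-limited Z77 board.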


----------



## Flam3h

Quote:


> Originally Posted by *Sexparty*
> 
> You realize that with the way sandy bridge-e works and ivy bridge works that if you were stuck at PCIe 2.0 16x/8x/8x on a X79 board that it would be the same as running PCIe 3.0 8x/4x/4x on a Z77 board yeah?
> Ivy bridge only has 16 PCIe lanes to work with whilst sandy bridge-e has 40 PCIe lanes to work with.
> I would do everything possible to keep the 3930k CPU (since it's much better than any ivy bridge chip) and either just enjoy it or try and get the sandy bridge-e system to work at PCIe 3.0 speeds. This of course is up to Nvidia's drivers at this point since I haven't heard anything on them re-enabling PCIe 3.0 on sandy bridge-e systems since release? PLX chips on the Z77 chipset are nice and such but don't address the actual issue at hand. Seems Nvidia are buying into Intels bull**** and not enabling it because X79 isn't offically PCIe 3.0 although it works perfectly fine at those speeds.
> So consider doing the un-thinkable and choosing AMD if it's that important to you 'right now' because the HD 7970s seem to work fine at PCIe 3.0 speeds on X79 platforms.
> Either downgrade to a ivy bridge platform so you can run native, supported and hassle free PCIe 3.0 at 16x with a GTX 690 (on something like a sabertooth board because spending money on a Z77 platform with limited bandwidth is waste of money, I know, I have an ASUS WS Revolution board and it does not solve the issue) or stick with sandy bridge-e (and upgrade to ivy bridge-e at the end of the year) and run 3x SLI on a X79 board which esentially will give you the same performance as 3x SLI on a Z77 system. More if you can get it to work at the faster speeds.


My main concern is the wasted GPU power of running 3x 680's on X79 while only being able to get PCI-E 2.0 8x speeds on 2 of the cards.

What I really want someone to say is, 'Yes, you can enable PCI-E 3.0 on the X79 Rampage IV Extreme using the registry fix with no problems.'

My worry is that if I purchase it and I cannot enable PCI-E 3.0 using the registry fix (crashes/boot issues), then I'm stuck with PCI-E 2.0 8x speeds, which will limit two of the cards, as Vega has shown in his YouTube videos.

I'm comparing that to the Z77 platform (Sniper 3), which can run 16x/8x/8x PCI-E 3.0 (yes, I know about the lane limit, but probably a lot better than the PCI-E 2.0 8x limitation). I will change to this platform if it really is going to help me get the most from my 3 cards.

I am not overly optimistic about Nvidia actually enabling PCI-E 3.0 on X79 officially. I also don't want to gamble on IB-E (in case it never comes, and even if it did, I really want to make a decision now based on my graphics card purchases and current tech).

I'm curious if Vega could shed some light, as I 'think' he may have used both systems with multiple cards.

The AMD series cards are of no use, as I'm using a Catleap 2B (1440P, 100Hz). With CrossFire, as far as I know, you cannot get over 85Hz at 1440P, and I want 100Hz (so Nvidia only).

The thing that is puzzling me is that Vega mentioned he couldn't enable PCI-e 3.0 using the registry fix without boot problems in one post. But in another (and in videos), he is using the registry fix on 301.10 (which didn't have 3.0 support natively). So am I to assume he managed to get the registry fix working with newer drivers?
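For context on the gap being weighed here, the usable bandwidth of each slot configuration can be sketched from the published PCIe signaling rates (Gen2: 5 GT/s with 8b/10b encoding; Gen3: 8 GT/s with 128b/130b). A rough back-of-the-envelope calculation:

```python
def pcie_bandwidth_gb_s(gen, lanes):
    """Usable one-way PCIe bandwidth in GB/s for a given generation and lane count."""
    # (raw rate in GT/s, encoding efficiency) per PCIe generation
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt_s, efficiency = rates[gen]
    return gt_s * efficiency * lanes / 8  # divide by 8: bits -> bytes

# The two configurations being compared in this post:
print(pcie_bandwidth_gb_s(2, 8))   # X79 fallback, PCIe 2.0 x8:  4.0 GB/s
print(pcie_bandwidth_gb_s(3, 16))  # native PCIe 3.0 x16:       ~15.75 GB/s
```

So a card stuck at PCIe 2.0 x8 gets roughly a quarter of the bandwidth of one at PCIe 3.0 x16, which is the gap driving the platform decision.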


----------



## Nocturin

Look for Vega's PCIe comparison that he did with his X79 board a month or two ago; he explains his process, and it might assist you with the questions that you have.


----------



## Flam3h

Quote:


> Originally Posted by *Nocturin*
> 
> Look for vega's PCIe comparison that he did with his X79 board a month or two ago, he explains his processes and it might assist you for the questions that you have.


Thanks,

Just had a quick flick through the whole thread. Still can't find any real answers.

On Overclockers.co.uk he states:

"On another note: Well I tried the registry fix to try and get my GTX 680's running at PCI-E 3.0 with my X79 MB and in Surround mode Windows fails to boot. So the article was right that there can be issues using that registry key."

Then on here:

"I love this new nVIdia Surround. It keeps the desktop task-bar only on the center monitor and when I maximize windows it only maximizes on the center screen. Awesome features! With the simple registry edit, I've got all of the cards running at PCI-E 3.0. In the BIOS I can switch between 1.0/2.0/3.0 at will so this will make for some nice tests."

'(4) EVGA GTX 680's running 1191MHz core, 3402 MHz Memory
nVidia Driver 301.10 with PCI-E 3.0 registry adjustment turned on and off for each applicable test
GPU-Z 0.6.0'

It's interesting.

I did read somewhere that an Asus support rep mentioned there are issues with PCI-e 3.0 support on the Nvidia 600 cards when the BCLK is over 105 MHz. I'm reading mixed reports: some people get it working, some are unable to boot into Windows, and some can boot but benchmarks/games crash (maybe they have their BCLK over 105 MHz?). Others seem to be able to make it work OK. There really isn't a great deal of information on it, and it's giving me a headache trying to make a decision (with a lot of money involved).

I'm also interested in why Vega switched to the Z77 platform if he indeed had X79 working fine with all four GPUs on native PCI-e 3.0. Hopefully he will pop in and help me clear my mind.


----------



## CallsignVega

Yes, the Rampage IV Extreme worked perfectly fine with the 3.0 registry fix. That original problem I had was Surround related and not the 3.0 change. Whether NVIDIA retains the registry fix option in future drivers is up in the air.

If you will be running a single Catleap, 2x GTX 680's should be fine. That is what I am currently running on my Sniper 3 until I get some news on the 7990 and the 680 Classifieds. If I go with four cards and 3-5 Catleaps, I would get the Extreme11 X79 that can do all slots at 16x. If I go with 2x 7990's, the Sniper 3 will be good.

For one Catleap, X79 really isn't needed unless you must go 3-4 GPUs. And as far as the 2B Catleap is concerned, I'd go AMD Crossfire, as they can do the 120+ Hz that ToastX recently discovered. NVIDIA is limited to 102 Hz in SLI.


----------



## ganganputput

Hi Vega, whilst waiting for the ASRock Extreme 11 to launch and reach Singapore for sale, I was browsing for methods to go bezel-less, and I came across this:

http://www.youtube.com/watch?annotation_id=annotation_29744&src_vid=dOY2lREuwjU&feature=iv&v=nVKGr_J1gF0

It's surround, and there is a company selling the software to do this. Any thoughts? Others, do chip in too; I want to know what this tech is really about, and I wonder what resolutions projectors can go up to. Not sure if projectors have lag?

If it's really good, it would be the ultimate setup for gaming, watching movies, and creating an entertainment room at home.

Please do share your thoughts.


----------



## CallsignVega

Quote:


> Originally Posted by *ganganputput*
> 
> Hi vega, whilst waiting for the asrock extreme 11 to launch and reach singapore for sale, i was browsing for methods to go bezeless, and i came across this.
> http://www.youtube.com/watch?annotation_id=annotation_29744&src_vid=dOY2lREuwjU&feature=iv&v=nVKGr_J1gF0
> its surround and there is a company selling the software to do this. any thoughts? others do chip in too, wanna know whats this tech really about, and i wonder what res can projectors go up to. not sure if projectors have lag?
> if its really good it would be an ultimate setup for gaming, watching movies, and create an entertainment room at home.
> please do share your thoughts


Multi-projector curved screen setups have a slew of problems. You can ask I88Bastard here for details. Stuff looks cool in videos, but you don't get to see all of the negatives. I mean, the list of negatives is just massive.

From focus issues due to the curved screen, to edge-blending software not working properly with games, to the massive heat and noise of many projectors above your head, to the input lag and slow response times of virtually all projectors, to the many $400 projector bulbs burning out every 1,000-2,000 hours, to being locked at a crappy 60Hz unless you go super low resolution... the list goes on and on. I did a ton of research on a multi-projector curved screen setup. They are very rare because they never turn out as good in practice as they seem in theory.

Did you also notice the performance of whatever machine was running that demo? He was getting like 10 FPS, if that. It would be completely unplayable. I also love when people say "FULL HD"?!?! lol, like 1080P @ 60Hz is something to gloat about.


----------



## ganganputput

I'm really thankful for your sharing; I'm glad you shared your know-how and previous research and saved us from trying out/wasting our hard-earned money on these alternatives.









By the way, I heard the ASRock 11 will be due soon; waiting for it to be released and priced.









Oh yes, I managed to find a supplier for 4GB Zotac 670s and 680s in Singapore; the supplier is now quoting the price. Things are progressing well for my 7680x1440 res.

If the ASRock 11 is priced too high with the i7-3820, then I'll settle for the G1 Sniper 3 with quad PCIe 3.0 x8 instead of quad PCIe 3.0 x16. I just hope that when I buy and run the quad 4GB 670s or 680s, I won't suffer an FPS limitation due to the x8 vs. x16 speed.


----------



## Flam3h

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, the Rampage IV Extreme worked perfectly fine with the 3.0 registry fix. That original problem I had was Surround related and not the 3.0 change. Whether NVIDIA retains the registry fix option in future drivers is up in the air.
> If you will be running a single Catleap, 2x GTX 680's should be fine. That is what I am currently running on my Sniper 3 until I get some news on the 7990 and 680 Classified's. Since if I go with four cards and 3-5 Catleap's, I would get the Extreme11 X79 that can do all slots at 16x. If I go with 2x 7990's, the Sniper 3 will be good.
> For one Catleap X79 really isn't needed unless you must go 3-4 GPUs. And as far as the 2B Catleap is concerned, I'd go AMD Crossfire as they can do 120+ Hz that ToastX recently discovered. NVIDIA is limited to 102 Hz in SLI.


Thanks for the reply Vega.

With that in mind I think I will continue with the x79 build.








You mentioned ToastX; that's very interesting (looked on 120hz.net). It seems there are side effects with it and a 'possibility' of more issues with future drivers.

I'm going to stick with 3x 680's (2GB should be fine @ 1440p), as I really want a good solid 100 FPS.

I think I'm happy using Nvidia on the Catleap. One reason is that 100Hz won't stress the monitor like 120Hz will (long-term longevity in mind). And two, I love Vsync, and it will be less taxing to get 100 FPS in games than 120 FPS with all the nice settings enabled.

I will post on here just to say how I got on when it's finally built.

BTW, was there any reason you chose to swap to the Z77 Sniper when you had X79 working with PCIe 3.0 on your quad setup? Was it just for the CPU, or did you want to see if the PLX made any difference?


----------



## Wattser93

Quote:


> Originally Posted by *CallsignVega*
> 
> Multi-projector curved screen setups have a slew of problems. You can ask I88Bastard here for details. Stuff looks cool in video's but you don't get to see all of the negatives. I mean the list of negatives is just massive.
> From focus issue due to the curved screen, edge blending software not working properly with games, to the massive heat and noise of many projectors above your head, to the input lag and slow response times of virtually all projectors, to the many $400 projector bulbs burning out every 1000-2000 hours, to being locked at crappy 60Hz unless you go super low resolution... the list goes on and on. I did a ton of research on a multi-projector curved screen setup. They are very rare because they never turn out as good in practice as thought of in theory.
> Did you also notice the performance of whatever machine was running that demo? He was getting like 10 FPS, if that. Would be completely unplayable. *I also love when people say "FULL HD"?!?! lol like 1080P @ 60Hz is something to gloat about.*


Lol. Consoles still render a lot of games at 720P @ 30Hz.


----------



## l88bastar

I love me a biggem projector.... I love how movies are larger than life and 3rd person games like GTAIV really pop.... I am currently in love with 3D DLP projectors, but that's an entirely different topic...... Anyway.... multi-projector setups are an entirely different animal versus multi-monitor setups, and both have their pros and cons.

As you can see here...I rocked a MASSIVE dual 1080p projector setup that was very impressive.
http://www.youtube.com/watch?v=ApfvK9f9eMI

If you are primarily into serious flight simulators or racing games, then multi-projector is going to be perfect for you, because the lifesize screens really enhance your immersion. However, if you are into FPS games then you might want to think twice. I have found that large screens make me very motion sick with FPS games because of the constant panning, turning, zooming, commotion etc. You see, flight sims and driving sims have more natural rolling motions which are easier for your brain to process, but the hectic, crazy pace of FPS games will make you dizzy... at least they made me dizzy... so yea, if you're gonna be playing BF3 on a super duper large 15' screen then here is something for you








http://www.youtube.com/watch?v=ouDDj6kr1qo

I sold my multi-projector setup when I got my triple Samsung S27A750 120hz 1080p monitor setup and do not regret the decision at all.

S27A750s in landscape

















S27A750s in portrait


----------



## VettePilot

@l88bastar

How do you run portrait with those Samsungs? I thought they had no VESA mounts and a fixed base with no height adjustment? I love the look of those, and I already tried the BenQ 2420 and did not like it.


----------



## iCrap

I had thought of doing a multi-projector setup at one point but I never got around to it. I'm curious though, what are the problems with a multi-projector setup?


----------



## Nocturin

I would much rather have a 4K projector and call it a day









----------



## l88bastar

Quote:


> Originally Posted by *Topgearfan*
> 
> @l88bastar
> how do you run portrait with those Samsungs? I thought they had no VESA mounts and it had a fixed base with no adjustment in height? I love the look of those and I already tried the BenQ 2420 and did not like it.


Lol did you really just ask that question in a Vega thread? Especially after you saw my video of a custom 4'x12' curved double projector screen? REALLY?

Just because something is not "designed" to do something doesn't mean it can't be done. Where there is a will there is a way.

Anyway....since you did ask nicely I will share my method with you:


----------



## Nocturin

Quote:


> Originally Posted by *l88bastar*
> 
> Lol did you really just ask that question in a Vega thread? Especially after you saw my video of a custom 4'x12' curved double projector screen? REALLY?
> Just because something is not "designed" to do something doesn't mean it can't be done. Where there is a will there is a way.
> Anyway....since you did ask nicely I will share my method with you:


wait.

I missed something; those monitors aren't VESA compliant?

I am disappoint if this is the case.


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Anyway....since you did ask nicely I will share my method with you:


I88 Bastard prefers Zephyr Hills. I prefer Poland Spring in my multi-monitor setups.


----------



## CallsignVega

Quote:


> Originally Posted by *Flam3h*
> 
> Thanks for the reply Vega.
> With that in mind I think I will continue with the x79 build.
> 
> 
> 
> 
> 
> 
> 
> .
> You mentioned about ToastX, that's very interesting (looked on 120hz.net). It seems there are side effects with it and a 'possibility' of more issues with future drivers.
> I'm going to stick with 3x 680's (2gb's should be fine @ 1440p), as I really want a good solid 100FPS.
> Think I'm happy using Nvidia on the Catleap. One reason is because 100hz won't stress the monitor like 120hz will do (long term longetivity in mind). And two, I love Vsync and it will be less taxing to get 100 FPS in games than @ 120FPS with all the nice settings enabled.
> I will post on here just to say how I got on when it's finally built.
> BTW, was there any reason you chose to swap to the z77 Sniper when you had x79 working with PCI 3.0 on your quad setup?. Was it just for the CPU?, or you wanted to see if the PLX made any difference?.


I was only going to run 1-2 GPUs, so X79 wasn't needed. If I were going to go 3-4 GPUs, I would upgrade to the Extreme11 X79 anyway. If I go 4x EVGA 680 Hydro Classifieds, which is very likely, I will just do the registry fix like everyone else. The 7990 seems to not exist, and even if it did, I'd only be able to get around 107Hz out of the Catleaps on them anyway, due to no AMD card being able to do 3x DL-DVI. The limit becomes the 107 Hz on the overclocked DisplayPort to DVI adapters. On NVIDIA you can do 3x DL-DVI natively, but the limit is 102 Hz in SLI. Granted, that 5 Hz isn't going to mean much. Now, if I could get 3+ Catleaps all at 120 Hz on AMD, then I'd reconsider.

EDIT: Screw that, 680's overclock *horribly* in SLI. Pretty soon I'll be back to near-stock boost clocks; these things keep crashing in SLI. I'll most likely be going 7970's again, as they can run at a much higher frequency in Crossfire without crashing and work really well at super high resolution Eyefinity.


----------



## ganganputput

These new 7970s have been released; one is a 25% OC'd version with 6 output ports:
http://vr-zone.com/computex/asus-shows-off-rog-matrix-7970-graphics-card/16157.html

And this other one appears to be a dual 7970:
http://vr-zone.com/computex/instead-of-amd-s-radeon-hd-7990-powercolor-set-to-debut-hd-7970-x2-at-computex-/16062.html

Could these fit the build for what you wanted to do?

I'm surprised to hear that 680s can't overclock well in SLI. Would this mean that if I run quad 4GB 670s, I may not be able to boost clocks by, say, 5% to bring them on par with stock quad 4GB 680s?

I was hoping I could slightly OC the quad 4GB 670 cards to meet stock 4GB 680 performance levels.


----------



## CallsignVega

Quote:


> Originally Posted by *ganganputput*
> 
> these new 7970s are released, ines a 25% oc'ed version with 6 output ports,
> http://vr-zone.com/computex/asus-shows-off-rog-matrix-7970-graphics-card/16157.html
> and this oher one appears to be a dual 7970?
> http://vr-zone.com/computex/instead-of-amd-s-radeon-hd-7990-powercolor-set-to-debut-hd-7970-x2-at-computex-/16062.html
> could these fit the build for what you wanted to do?
> im suprised to hear that 680s cant over clock well in sli. would this mean that if i run quad 670s 4GB i may not be able to boost clocks by say 5%? to bring it on par with a stock quad 680s 4GB?
> i was hoping i could slight oc the quad 670 4gb cards to meet stock 680 4gb performance levels.


Ya, if Asus squeezed a working dual-link DVI port on there while all four DisplayPort slots are filled, I'd buy four in a heartbeat. But if it is like the Asus DCUII (which I'm 95% certain it is), it has the following limitations:



When the DVI port is used in dual-link mode, it disables one of the DisplayPorts. The other DVI is always single-link. That will be fine for three screens but is a no-go for my 5x1 2B Catleap project. I don't see why it's so hard for someone to come out with an "Eyefinity 6" type 7970 with 6 mini-DP.

The 670's will be able to clock a lot higher than stock 680's, IMO. It's just that, compared to single-GPU clocks, I have found 680 SLI clocking to be less than stellar.


----------



## ganganputput

Quote:


> Originally Posted by *l88bastar*
> 
> I love me a biggem projector.... I love how movies are larger than life and 3rd person games like GTAIV really pop.... I am currently in love with 3D DLP projectors, but thats an entirely different topic......Anyway....multi-projector setups are an entirely different animal versus multi-monitor setups and both have their pros and cons.
> As you can see here...I rocked a MASSIVE dual 1080p projector setup that was very impressive.
> http://www.youtube.com/watch?v=ApfvK9f9eMI
> If you are primariliy into serious flight simulators or racing games then multi-projector are gonna be perfect for you because the lifesize screens really enhance your immersion. However, if you are into FPS games then you might want to think twice. I have found that large screens make me very motion sick with FPS games because of the constant panning, turning, zooming, commotion etc. You see flight sims and driving sims have more natural rolling motions which are easier for your brain to process, but the hectic crazy pace of FPS games will make you dizzy...at least they made me dizzy....so yea if your gonna be playing bf3 on a super duper large 15' screen then here is something for you
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.youtube.com/watch?v=ouDDj6kr1qo
> I sold my multi-projector setup when I got my triple Samsung S27A750 120hz 1080p monitor setup and do not regret the decision at all.


That was a really cool projector setup. I guess I would get motion sick, because although I do play simulators, I would also intend to play FPS games too, and watch movies, in the living room. I was in the navy for 11 years and quit; motion sickness was usually overcome through patches or medication.

I'll stick to my tri samsung SA850D monitors







 and work on getting sufficient horsepower to run them. I'll live with the bezels till the LG DM92, or some copycat model from monitor manufacturers, releases a close-to-no-bezel version.

@vega: I thought of stripping my monitors Vega-style, but there aren't any screw holes anywhere along the edges and I couldn't pry open the case. I thought I'd leave it, in case I break tabs or mouldings, which would void the warranty.


----------



## VettePilot

Quote:


> Originally Posted by *l88bastar*
> 
> Lol did you really just ask that question in a Vega thread? Especially after you saw my video of a custom 4'x12' curved double projector screen? REALLY?
> Just because something is not "designed" to do something doesn't mean it can't be done. Where there is a will there is a way.
> Anyway....since you did ask nicely I will share my method with you:


Honestly, I did not pay attention, nor did I care to read every post in this thread to find out. The only video I saw was one with the projector, and then someone saying that the other 3 monitors were Dells, so clearly that would not have told me what you were using. I figured you knew of some mounting option other than just propping them up with some water bottles. Since that is the way you would have to do it, I will pass on those monitors and wait for something better. I want to use some actual VESA-mount Ergotron stands.


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, if Asus squeezed a working dual link DVI port on there while all four display port slots are filled, I'd buy four in a heartbeat. But if it is like the Asus DCUII (which I'm 95% certain it is), it has the following limitations:
> snip
> When the DVI port is used in dual link mode, it disables one of the display ports. The other DVI is always single link. That will be fine for three screen's but will be a no-go for my 5x1 2B Catleap project. I don't see why it's so hard for someone to come out with an "Eyefinity 6" type 7970 with 6 mini-DP.
> The 670's will be able to clock a lot higher than stock 680's IMO. It's just compared to single GPU clocks, SLI 680 clocking I have found to be less than stellar.


Why do you think the 670s will clock higher? The smaller PCB, or better designs? So which 670 do you think would be best to get? I was planning on getting an EVGA FTW, seeing as it is basically a 680, but what do you think?


----------



## CallsignVega

Quote:


> Originally Posted by *feteru*
> 
> Why do you think the 670s will clock higher? The smaller PCB, or better designs. So what 670 do you think would be best to get, as I was planning on getting an EVGA FTW, seeing as it is basically a 680, but what do you think?


He asked if overclocked 670's could boost beyond _stock_ 680 levels. That is all.

Do you guys think those are 1080P or 1440P monitors in the Eyefinity 6 configuration on the right hand side of this image?:

http://chinese.vr-zone.com/wp-content/uploads/2012/06/marsiii_1.jpg

I am leaning towards 1440P, seeing how small the text is on that center top monitor. That would mean the Matrix 7970 can do 4x DP and 2x Dual-Link DVI at the same time, which would be epic. Then again, those screens could be 1080P and I'm getting all excited for nothing.


----------



## kakee

Found here. ASUS Shows Off ROG Matrix 7970

Found near the end.
Quote:


> ...two dual-link DVI, and four DisplayPort connectors...


----------



## CallsignVega

Quote:


> Originally Posted by *kakee*
> 
> Found here. ASUS Shows Off ROG Matrix 7970
> Found close of end.
> Quote:
> 
> 
> 
> ...two dual-link DVI, and four DisplayPort connectors...

Ya, but before that it says "could include". That isn't very reassuring lol.


----------



## kakee

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, but before that it says "could include". That isn't very reassuring lol.


Bad English







I forgot to run that part through Google Translate.


----------



## De-Zant

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *feteru*
> 
> Why do you think the 670s will clock higher? The smaller PCB, or better designs. So what 670 do you think would be best to get, as I was planning on getting an EVGA FTW, seeing as it is basically a 680, but what do you think?
> 
> 
> 
> He asked if overclocked 670's could boost beyond _stock_ 680 levels. That is all.
> 
> Do you guys think those are 1080P or 1440P monitors in the Eyefinity 6 configuration on the right hand side of this image?:
> 
> http://chinese.vr-zone.com/wp-content/uploads/2012/06/marsiii_1.jpg
> 
> I am leaning towards 1440P seeing how small the text is on that center top monitor. That would mean the Matrix 7970 can do 4x DP and 2x Dual Link DVI at the same time which would be epic. Then again, those screens can be 1080P and I'm getting all excite for nothing.

I think the monitors say "Hanns-G" (check the one in the upper right corner). That would make them 1920x1080.

Also, the icon, taskbar, and app sizes indicate that it's just 1920x1080.


----------



## nvidiaftw12

Def 1080.


----------



## JassimH

Is there any way to get PCIe 3.0 running off my 3930K in my RIVE board?

My GTX 680 DC2 Tops in SLI seem to have MUCH LESS performance according to the PCIe 3.0 vs. 2.0 posts, and I'd like to improve that without getting Ivy Bridge :\.


----------



## CallsignVega

Quote:


> Originally Posted by *De-Zant*
> 
> Think the monitors say "Hanns-G". (check the one in the upper right corner) That would make them 1920x1080.
> Also, the icon, taskbar, and app size also indicate that it's just 1920x1080.


Ya, it must have been the angle of the photo that threw me off. The thought of a GPU with 2x Dual-Link DVI and 4x DP gets me all giddy. But alas, it just may not be...








Quote:


> Originally Posted by *JassimH*
> 
> Is there anyway to get pcie 3.0 running off my 3930k in my RIVE board?
> My gtx 680 DC2 top in SLI seems to have MUCH LESS performance according to PCIE 3.0 v 2.0 posts and I'd like to improve that without getting ivy bridge :\.


http://translate.google.com/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fwww.4gamer.net%2Fgames%2F022%2FG002210%2F20120323002%2F


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, must have been the angle of the photo that threw me off. The thought of a GPU with 2x Dual Link DVI and 4x DP get's me all giddy. But alas it just may not be...


You could ask the OP of this thread/kip69 to look at the display model of the card without the monitors plugged in, seen at 1:50 here.


----------



## CallsignVega

I just got a response from Asus saying that the 7970 Matrix has the same output limitations as the DCUII, so that's a bust.









Although, there is still hope that my 5x1 2B Catleap project isn't dead in the water:





7970 X2 by HIS. I would bet 99% that the DVI connector is dual-link, seeing as there is only one of them in conjunction with the 4x mini-DP. Those would work out perfectly. Toss two of those in my Sniper 3 for quad-fire 7970, and off I go. Although I highly doubt anyone would make a waterblock for such a specialized card; I might have to rig up something custom.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> I just got a response from Asus saying that the 7970 Matrix has the same output limitations as the DCUII, so that's a bust.
> 
> 
> 
> 
> 
> 
> 
> 
> Although, there is still hope that my 5x1 2B catleap project isn't dead in the water:
> 
> 
> 7970 X2 by HIS. I would bet 99% that the DVI connector is Dual Link seeing as there is only one of them in conjunction with the 4x mini-DP. Those would work out perfect. Toss two of those in my Sniper 3 for quad-fire 7970, and off I go. Although i highly doubt anyone would make a waterblock for such a specialized card. I might have to rig up something custom.


Saw that as well. Actually like the looks of it. 80x better than that other 7970 X2.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Saw that as well. Acctually like the looks of it. 80x better than that other 7970 x2.


I agree. Two large fans versus three small fans, and the HIS has far superior connectivity versus PowerColor's Devil 13.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> I agree. Two large fans versus three small fans, and the HIS has far superior connectivity versus the Powercoler's Devil 13.


On the issue of connectivity: I don't understand why you couldn't just connect the monitors up to two 7970s (like two on one card and three on the other). I mean, that's all the 7970 X2 is, isn't it? Two 7970s.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> On the issue of connectivity. I don't understand why you couldn't just connect the monitors up two 2 7970/s (like 2 on one card and three on the other.) I mean that's all the 7970 x2 is isn't it? Two 7970s.


Crossfire is kinda silly: you must connect *all* displays to the "master" card. All other GPUs send the primary or "master" GPU their AFR frames to render and output. This is one area where NVIDIA has a leg up, IMO.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> Crossfire is kinda silly, you must connect *all* displays to the "master" card. All other GPU's send the primary or "master' GPU their AFR frames to render and output. This is one area where NVIDIA as a leg up IMO.


Huh. So on the 7970 X2, all those connectors are on one card and the other is just crossfired to it. So in essence, a regular 7970 could support the displays; AMD just doesn't want to make it so.


----------



## axipher

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> Crossfire is kinda silly, you must connect *all* displays to the "master" card. All other GPU's send the primary or "master' GPU their AFR frames to render and output. This is one area where NVIDIA as a leg up IMO.
> 
> 
> 
> Huh. So on the 7970 x2 all those connectors are on one card and the other is just crossfired to it. So in all essence a regular 7970 could support the displays, amd just doesn't want to make it so.

All cards since the 5000 series support 6 screens, if I'm not mistaken; it's just expensive to put that many connectors on cards, so most vendors stop at 3 on low-range cards and 4 on medium and high-range cards.


----------



## nvidiaftw12

Quote:


> Originally Posted by *axipher*
> 
> All cards since the 5000 series support 6 screens if I'm not mistaken, it's just expensive to put that many connectors on cards so most vendors stop at 3 on the low-range, and 4 on the medium and high-range cards.


There are enough connectors for 5/6 screens, just not enough for 1440p screens.


----------



## axipher

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Quote:
> 
> 
> 
> Originally Posted by *axipher*
> 
> All cards since the 5000 series support 6 screens if I'm not mistaken, it's just expensive to put that many connectors on cards so most vendors stop at 3 on the low-range, and 4 on the medium and high-range cards.
> 
> 
> 
> There is enough connectors for 5/6 screens, just not enough for 1440p screens.

Ah, I didn't catch that part; I thought we were just talking about screen support, not HD+ screen support. My bad.


----------



## CallsignVega

Ya, this HIS X2 card is so special because it can run five 1440P+ screens or five 120Hz monitors, something no other 79xx series card can. Can't wait for this bad boy!


----------



## nvidiaftw12

But can it run five 1440p screens at 120Hz, that is the question.


----------



## CallsignVega

Quote:


> Originally Posted by *nvidiaftw12*
> 
> But can it run 5 1440p screens at 120hz is the question.


Technically it can. But all overclockable 1440P monitors are DL-DVI right now, so adapters must be used (in my case, four of them). The Accell 330 MHz pixel clock DP to DL-DVI adapters that I use can be overclocked to the ~415 MHz range. That allows a refresh rate of around 107 Hz on an overclockable 1440P monitor. 107 Hz/FPS is still pretty darn sweet!
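The ~107 Hz ceiling follows from simple pixel-clock arithmetic: refresh rate is the pixel clock divided by the total (active plus blanking) pixels per frame. A rough sketch, assuming CVT reduced-blanking totals of 2720x1481 for a 2560x1440 panel; the custom timings people actually flash vary, and blanking tighter than CVT-RB is what pushes the result from ~103 toward ~107 Hz:

```python
def max_refresh_hz(pixel_clock_mhz, h_total, v_total):
    # Refresh rate = pixel clock / total pixels scanned per frame
    return pixel_clock_mhz * 1_000_000 / (h_total * v_total)

# 2560x1440 with assumed CVT reduced-blanking totals of 2720 x 1481
print(round(max_refresh_hz(330, 2720, 1481)))  # adapter's rated clock: ~82 Hz
print(round(max_refresh_hz(415, 2720, 1481)))  # overclocked:          ~103 Hz
```

At the stock DL-DVI limit of 2x165 = 330 MHz you would be stuck around 82 Hz with these timings, which is why the adapter overclock matters so much here.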


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> Technically it can. But all 1440P overclockable monitors are DL-DVI right now. So adapters must be used (in my case four of them). The Accell 330 MHz pixel clock DP to DL-DVI adapters that I use can be overclocked to the ~415 MHz range. That allows a refresh rate of around 107 Hz on a overclockable 1440P monitor. 107 Hz/FPS is still pretty darn sweet!


Find better adapters, ones that overclock enough to allow 120 Hz.


----------



## Citra

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Find better adapters. Ones that overclock to allow 120 hz.


Vega only gets the best. If that's what he is going to use, it's probably already the best. Lol.


----------



## CallsignVega

There is no such thing. These adapters already overclock well beyond the 2x 165 MHz specification for DL-DVI.


----------



## nvidiaftw12

Well that sucks. Only a measly 107 Hz.


----------



## Billy_5110

The GTX 680 Classified is now "announced".

They probably saw your topic and said to themselves: this guy needs our help... overpriced GPU with insane tweaking features and phases incommmiiiinnnnngggg!!!!!!!!!!!!!


----------



## zdude

Okay Vega, question for you: when I upgrade to the GTX 680, will the Classified HC provide any benefit for me? At first I will only be gaming on a single screen, so the 4GB VRAM is useless, but I may go to a triple 1440p setup at some point. What I am asking is: will the extra VRMs provide much benefit to the clock speeds? (I noticed you didn't get the 580 Classified; that's why I asked.)


----------



## CallsignVega

Actually, EVGA has stated the memory overclock limit may be reduced just a bit, as larger amounts of memory overclock worse. 4GB of VRAM is useless on a single screen and is only useful for three or more screens at 1440P or greater resolution. And if you go three 1440P, you will need 2+ Classifieds to even run them properly.


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> Actually, EVGA has stated the memory overclock limit may be reduced just a bit, as larger amounts of memory overclock worse. 4GB of VRAM is useless on a single screen and is only useful for three or more screens at 1440P or greater resolution. And if you go three 1440P, you will need 2+ Classifieds to even run them properly.


Hey Vega - any word on the release of the ASRock Extreme11 board? Is that still your plan? Has anyone had latency (or other) problems with the PLX lane splitter?


----------



## TA4K

Isn't it better to have more PCIe lanes in CFX so the cards can talk to each other as well as through the CFX bridge? That would make the Extreme11 more reasonable, yes?


----------



## CallsignVega

Extreme11 is still late July. It wouldn't make sense to get it if I get two X2 7970's for quad-fire seeing as I already have two 16x speed slots with the Sniper 3. I was going to go with the Extreme11 if an "Eyefinity 6" type 7970 came out so I could get four of those. But that hasn't been the case. I will still hold onto my 3960X until the dust settles just to make sure.

If for some reason my 5x 2B catleap is a bust with the X2's, I still may go with three monitors and 4x 680 Classified's, so there is another case I might get the Extreme11. Plus that board is just full of win so I might get it anyway.









I think two X2's going through one PLX (Sniper 3) might run a few percentage points faster than two X2's running through two PLX's (Extreme11), well unless I run the X2's in 16x slots #1 and #2 (with water cooling and cutting the bracket of course as X2's are 3-slot) instead of the traditional #1 and #3 that have their own PLX tied to them.

In the end state it most likely doesn't matter though as whether I keep Sniper 3 or go Extreme11 both X2's will have 16x speed slots.


----------



## Samurai Batgirl

You're not going to try for the HIS 7970 IceQ X2?


----------



## CallsignVega

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> You're not going to try for the HIS 7970 IceQ X2?


I am. I have to get this 5x1 2B Catleap setup out of my system. Something about 18.4 million pixels running at 100+ Hz/FPS with great viewing angles draws me in like a moth to a flame.









http://www.guru3d.com/news/three-new-radeon-hd-7900-cards-coming/


----------



## Samurai Batgirl

I'm waiting to see it, too


----------



## zdude

Quote:


> Originally Posted by *CallsignVega*
> 
> I am. I have to get this 5x1 2B Catleap setup out of my system. Something about 18.4 million pixels running at 100+ Hz/FPS with great viewing angles draws me in like a moth to a flame.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/news/three-new-radeon-hd-7900-cards-coming/


Okay, it's official: we are all addicts to a perfectly legal substance. Vega is the only one with the funds to satisfy his addiction, though.


----------



## okonk

Hey Vega,

Just a thought/question about mixing 4x DP and 1x DL-DVI. My current setup is 3x 1080p 60Hz monitors running from 2x reference Sapphire 7970s (using DP->DVI, HDMI, DVI), and with Eyefinity enabled I get tearing on my DP->DVI display.

I believe this is due to the display port running on a different clock source on the card to the HDMI/DVI ports. So I can either have my display port running synced and tearing on my HDMI/DVI ports, or how I have it currently with just the DP->DVI out of sync.

Do you know anything about this problem/have you considered it for your setup? Or are these new cards going to do something different so you won't get any tearing?

I'm also waiting for 6-DP 7970s or maybe even a 6-DP 7990 to fix my tearing problem. I'm also considering getting 3 more screens just for fun, which I need more ports on the card for that unless I get some DP splitters.


----------



## ganganputput

Hi Vega, I finally placed orders for the EVGA 670 SC 4GB. Got four of them; it takes about 2-3 weeks for them to ship to Singapore.

Next step is to wait for next month's wages, then I can get a Sniper and a 3770K.

Slow but steady, I'll get a souped-up PC.







Good to know that you managed to find the HIS 7970 X2 that may fit your needs.









What RAM should I get for the PC, considering that I want to OC the 3770K to the max speed possible while stable 24/7? Under water (not chilled).
Which RAM speed and brand do I need?

If needed, I might go the Koolance RAM-block way.







Does that mean the RAM I get needs to have an easily removable cooler?

Edit: yes, I'll operate on the 3770K to replace the TIM inside.


----------



## CallsignVega

Quote:


> Originally Posted by *okonk*
> 
> Hey Vega,
> Just a thought/question about mixing 4x DP and 1x DL-DVI. My current setup is 3x 1080p 60Hz monitors running from 2x reference Sapphire 7970s (using DP->DVI, HDMI, DVI), and with Eyefinity enabled I get tearing on my DP->DVI display.
> I believe this is due to the display port running on a different clock source on the card to the HDMI/DVI ports. So I can either have my display port running synced and tearing on my HDMI/DVI ports, or how I have it currently with just the DP->DVI out of sync.
> Do you know anything about this problem/have you considered it for your setup? Or are these new cards going to do something different so you won't get any tearing?
> I'm also waiting for 6-DP 7970s or maybe even a 6-DP 7990 to fix my tearing problem. I'm also considering getting 3 more screens just for fun, which I need more ports on the card for that unless I get some DP splitters.


Yes, the clock Sync problem has been a problem since Eyefinity came out. According to the latest patch notes, the new drivers have fixed this problem. Have you tried these drivers:

http://forums.guru3d.com/showthread.php?t=364400

Try those and please report back if the tearing is fixed.
Quote:


> Originally Posted by *ganganputput*
> 
> Hi Vega, I finally placed orders for the EVGA 670 SC 4GB. Got four of them; it takes about 2-3 weeks for them to ship to Singapore.
> Next step is to wait for next month's wages, then I can get a Sniper and a 3770K.
> Slow but steady, I'll get a souped-up PC.
> 
> 
> 
> 
> 
> 
> 
> Good to know that you managed to find the HIS 7970 X2 that may fit your needs.
> 
> 
> 
> 
> 
> 
> 
> 
> What RAM should I get for the PC, considering that I want to OC the 3770K to the max speed possible while stable 24/7? Under water (not chilled).
> Which RAM speed and brand do I need?
> If needed, I might go the Koolance RAM-block way.
> 
> 
> 
> 
> 
> 
> 
> Does that mean the RAM I get needs to have an easily removable cooler?
> Edit: yes, I'll operate on the 3770K to replace the TIM inside.


Cool. RAM speed doesn't make a huge difference in games, but I still get fairly fast ones. I use 16GB so I don't have to run a page file. My current ones are 2400 MHz at 9-11-11-28 Team Group:

http://www.newegg.com/Product/Product.aspx?Item=N82E16820313243


----------



## okonk

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, the clock Sync problem has been a problem since Eyefinity came out. According to the latest patch notes, the new drivers have fixed this problem. Have you tried these drivers:
> http://forums.guru3d.com/showthread.php?t=364400
> Try those and please report back if the tearing is fixed.


One step forward, one step back. I don't think it's tearing anymore, but now it's detecting my hdmi screen as an hdtv and screwing up the display with black borders around it like it's scaling, but scaling is disabled... Might try to fix it, but if I can't will just revert and wait until these drivers are officially released. Thanks.

That card with 4x DP and 2x DVI is tempting now.


----------



## Bigbrag

Quote:


> Originally Posted by *okonk*
> 
> One step forward, one step back. I don't think it's tearing anymore, but now it's detecting my hdmi screen as an hdtv and screwing up the display with black borders around it like it's scaling, but scaling is disabled... Might try to fix it, but if I can't will just revert and wait until these drivers are officially released. Thanks.
> That card with 4x DP and 2x DVI is tempting now.


You have to use the overscan option in the Catalyst Control Center. I have had the same issue several times. I'm currently running 3 Catleaps with 2 mini-DP to dual-link DVI converters.


----------



## CallsignVega

Now I know why the 7990, the X2 cards, and the Extreme11 are all taking forever to come out. What do they all have in common? PLX chips:

http://www.techpowerup.com/167683/Shortage-of-PEX8747-Bridge-Chip-Disturbs-Various-Launch-Schedules.html


----------



## CallsignVega

Going to start disassembling the 2B Catleaps this weekend to mount on my new 5x1 portrait monitor stand. I will have to find some way to secure the back of the panels to the VESA mounts which should be interesting. I will run three panels on 2x 680's until the 7990 and/or HIS X2 comes out.

I will take some performance numbers from some games on single Catleap and then performance numbers on the 3x1 portrait Catleap setup. You guys think the performance will be exactly 33% that of the single screen? Higher/lower and why?
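As a baseline for guessing: if rendering cost scaled purely with pixel count, triple the pixels would mean exactly one third of the frame rate. A toy calculation (illustrative only):

```python
single_pixels = 2560 * 1440        # one Catleap
triple_pixels = 3 * single_pixels  # 3x1 portrait surround

# Purely fill-rate-bound case: FPS scales inversely with pixel count.
fps_fraction = single_pixels / triple_pixels  # exactly 1/3

# Real results drift either way: CPU-side work (draw calls, game logic)
# doesn't grow with resolution, but the wider FOV adds extra geometry.
```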


----------



## Quest99

Which stand are you using? or is it custom made?


----------



## CallsignVega

Quote:


> Originally Posted by *Quest99*
> 
> Which stand are you using? or is it custom made?


http://www.wsgf.org/products/wsgf-edition-monitor-stand

They are just coming back in stock.


----------



## Bigbrag

I'm looking forward to seeing how you are going to mount these 2B's. It will help me decide whether or not to debezel mine. That stand looks great as well; I love the base for it. I may get the triple one. The link you posted for your stand says that 5x1 does not work with 27" and 30" screens? I'm sure you checked the measurements. I just thought it was odd.


----------



## CallsignVega

Quote:


> Originally Posted by *Bigbrag*
> 
> I'm looking forward to how you are going to mount these 2b's.It will help me decide whether or not to debezel mine. That stand looks great as well. I love the base for it. I may get the triple one. The link you posted for your stand says that 5x1 does not work with 27" and 30" screens? I'm sure you checked the measurements. I just thought it was odd.


It won't work with the bezels on the monitors. They will fit after I give them the Vega treatment.









Here is a video of the stand:

http://www.youtube.com/watch?v=e3fPtLXBMCM

The thing I like about this stand is it allows for a lot of adjustability, something that is pretty important for large multi-monitor setups.


----------



## zdude

My vote is 40% performance in surround vs. single...


----------



## stren

So how's the debezelling going?


----------



## CallsignVega

Quote:


> Originally Posted by *stren*
> 
> SO how's the debezelling going?


Good, working on everything now. Have some paint drying on some mounts, so I should have some pictures up soon. Catleaps don't have any sort of mounting options for the bare panels, so custom was required. I simply had to spread the load out over the back of the panel, instead of over a small area as with a VESA mount. These panels are heavy! Easily three times the weight of the 23" 120Hz Samsungs I've used in the past.

The 5x WSGF mount/stand is just awesome. More to follow on that.


----------



## ganganputput

Quote:


> Originally Posted by *CallsignVega*
> 
> Going to start disassembling the 2B Catleaps this weekend to mount on my new 5x1 portrait monitor stand. I will have to find some way to secure the back of the panels to the VESA mounts which should be interesting. I will run three panels on 2x 680's until the 7990 and/or HIS X2 comes out.
> I will take some performance numbers from some games on single Catleap and then performance numbers on the 3x1 portrait Catleap setup. You guys think the performance will be exactly 33% that of the single screen? Higher/lower and why?


I'm guessing that it's not going to be exactly 33%; it will be higher.
Why?

I had two 680 2GB cards in SLI pushing 3x 2560x1440 monitors (Samsung SA850D) and got around 60 FPS average in BF3 on Ultra with no MSAA on one monitor, but with three monitors I got about 25-35 FPS. These numbers are from Fraps and visual monitoring; I didn't do any precise benchmarking like Heaven and such.
I sold off one 680 in a financial lull period thereafter, and because the 2GB of memory was topping out at full settings. Waiting for my quad 670 4GB SC now.


----------



## ganganputput

How thin have you gotten the bezels on each 2B? About 10mm?


----------



## CallsignVega

The smallest I can make the bezel gap is 12.7mm (right at one-half inch). While that is almost double the bezel gap of the 23" 120Hz Samsungs (I got those down to ~1/4"), the Catleaps are much larger panels, so when I have five of them set up, the ratio of bezel gap to screen real estate will still be pretty good. I am already loving the IPS viewing angles in portrait versus the TN panels.

Fail-bucket is down at the moment, so I can't upload any pictures just yet (too lazy to go somewhere else, LOL). You guys will like the photos







. This setup will just be insane when finished. I hope 4-way 7970's will be able to do it justice.


----------



## CallsignVega

"It was one of the scariest moments of my life, but it had to be done. It takes some guts to take apart a 19" LCD screen that cost $120."

LOL, a read for a good laugh: http://blog.fieryferret.com/labels/display.html


----------



## CallsignVega

Did some more extensive testing. With the monitor running at 60 Hz, the center of the ribbon cable operates between 130 to 175 F. So it looks like they are designed to get pretty warm. This is with a 71 F ambient temp. With the monitor overclocked to 120 Hz, the ribbon cables operate at 150 to 245 F. So a nice increase there.

Temp readings are in the center of the cable where the electric traces converge to the middle where the red arrow is:










Is anyone here an electrical engineer, or does anyone know one, who could say whether this is detrimental? Who knows, maybe this special type of ribbon cable is perfectly fine at these temperatures. Just don't accidentally touch one like I did while adjusting my de-bezeled monitor, as they burn!

As for the rest of the electronics, everything seems A-OK. The display controller chip, 220V chokes, capacitors, LED bar, and everything else stay at 110 F or lower at 60 Hz. Testing the same components at 120 Hz, everything stays below 120 F (in my open-air setup). That isn't much of a rise, so no worries there. It's those cables that show an appreciable rise in temp. Since I haven't heard of any failing, the temps level off, and I see no damage to the ribbon cable, I am sure it will be fine. It just caught me off guard that a cable could get that hot!
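For readers who think in Celsius (most cable and component ratings are quoted that way), the standard conversion puts these readings in context. A quick sketch converting the measured extremes:

```python
def f_to_c(fahrenheit):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (fahrenheit - 32) * 5 / 9

# Measured ribbon-cable extremes at 120 Hz:
low_c = f_to_c(150)   # about 65.6 C
high_c = f_to_c(245)  # about 118.3 C
```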

Some more pics:

This is one bad mojo stand (I love it).














































Time to set up the de-bezel assembly line for the rest.


----------



## Nocturin

My name is Seth McFarlane, and I approve of this build.


----------



## stren

Nice thanks for the update vega - so the monitor leans against the stand and sits on a neck pillow? >> edit - I see that's not a neck pillow it's the base of the stand lol.

I'm an EE, though I deal more with chip design than board and cables. That sounds very hot for a cable, hard to imagine it's the cable causing that unless those are powers that are drastically undersized and resistive, maybe the wires are effectively heatsinking heat away from something else? It's also hard to imagine that removing the cover would make that temperature rise unless there was some kind of heatsink or AL on there. For me the major concern of hot interconnect is electromigration which increases exponentially with temperatures and will decrease the life - we size our interconnect large for a 10 year life at 110C (as much as is physically possible - sometimes you just can't make it big enough). However our interconnect is a *lot* smaller than real wires where EM is never usually a concern AFAIK.


----------



## Levesque

Hummm. Strange. Going from some great 3X ZR30W displays to some cheapo-random-luck-to-get-a-good-one Catleap displays.

Something is not computing here. Not logical. ... You went away from an awesome set-up with 3X 30'' ZR30W because you were ''not satisfied'' with those displays in portrait (too big, headaches, etc blah blah blah I don't remember well why you went away from that set-up then), and a couple of months later, strangely, you go back to the exact same set-up, but with 10X cheaper displays...

So the 3X30'' were not that terribad then? Or were you simply out of budget then?









I think your fanclub deserves an explanation.


----------



## stren

Quote:


> Originally Posted by *Levesque*
> 
> Hummm. Strange. Going from some great 3X ZR30W displays to some cheapo-random-luck-to-get-a-good-one Catleap displays.
> Something is not computing here. Not logical. ... You went away from an awesome set-up with 3X 30'' ZR30W because you were ''not satisfied'' with those displays in portrait (too big, headaches, etc blah blah blah I don't remember well why you went away from that set-up then), and a couple of months later, strangely, you go back to the exact same set-up, but with 10X cheaper displays...
> So the 3X30'' were not that terribad then? Or were you simply out of budget then?
> 
> 
> 
> 
> 
> 
> 
> 
> I think your fanclub deserves an explanation.


Silly boy these are 120hz 1440p monitors. They may be "off-brand" but they are extremely good. They are also smaller than the 30 inchers (27) while increasing pixel density.

Vega went from 3x 30" to 5 x 120Hz 1080p (for higher refresh) to 3x FW900 (for reduced lag and high refresh) to 1xFW900 (for minimal lag for FPS) to 1xFW900 for fps and 5xCatleap 120Hz 1440p for immersive gaming.


----------



## Blizlake

Quote:


> Originally Posted by *stren*
> 
> Vega went from 3x 30" to 5 x 120Hz 1080p (for higher refresh) to 3x FW900 (for reduced lag and high refresh) to 1xFW900 (for minimal lag for FPS) to 1xFW900 for fps and 5xCatleap 120Hz 1440p for immersive gaming.


I think I feel a bit dizzy now...









Oh btw, me and my limited knowledge agree with stren's previous post about the temps.


----------



## CallsignVega

Quote:


> Originally Posted by *Levesque*
> 
> Hummm. Strange. Going from some great 3X ZR30W displays to some cheapo-random-luck-to-get-a-good-one Catleap displays.
> Something is not computing here. Not logical. ... You went away from an awesome set-up with 3X 30'' ZR30W because you were ''not satisfied'' with those displays in portrait (too big, headaches, etc blah blah blah I don't remember well why you went away from that set-up then), and a couple of months later, strangely, you go back to the exact same set-up, but with 10X cheaper displays...
> So the 3X30'' were not that terribad then? Or were you simply out of budget then?
> 
> 
> 
> 
> 
> 
> 
> 
> I think your fanclub deserves an explanation.


It's been a while since a resident troll has visited my thread.







Who cares who assembled the Catleaps? They use the same LG panel as the other 1440P monitors, except with better (overclockable) electronics. Only noobs play games at 60 Hz. 3x 30" horrid-sparkle matte-coating monitors was my "starter" setup man, try and keep up. When you want to join the big boys at 18.4 million pixels at 120 Hz, let me know. I can send you a picture in the mail.








Quote:


> Originally Posted by *stren*
> 
> Nice thanks for the update vega - so the monitor leans against the stand and sits on a neck pillow? >> edit - I see that's not a neck pillow it's the base of the stand lol.
> I'm an EE, though I deal more with chip design than board and cables. That sounds very hot for a cable, hard to imagine it's the cable causing that unless those are powers that are drastically undersized and resistive, maybe the wires are effectively heatsinking heat away from something else? It's also hard to imagine that removing the cover would make that temperature rise unless there was some kind of heatsink or AL on there. For me the major concern of hot interconnect is electromigration which increases exponentially with temperatures and will decrease the life - we size our interconnect large for a 10 year life at 110C (as much as is physically possible - sometimes you just can't make it big enough). However our interconnect is a *lot* smaller than real wires where EM is never usually a concern AFAIK.


It's the actual wires in the cable generating the heat. If I turn the monitor off, I can see the temp drop off sharply within a few seconds. It's simply that the amount of bandwidth passing through the cable has sharply increased. There are no heat sinks or fans associated with this monitor. I would think that the monitor being in open air versus in the case would make it even cooler.


----------



## renat77

CallsignVega

I'd like to draw on your experience with a problem. Please help me.

At first some information about my system:

Motherboard: ASRock X79 Fatal1ty Champion
Processor: Intel Core i7 [email protected] 4.8 GHz
RAM: 16GB G.Skill @ 1600 MHz
GPU: 3x Gigabyte Windforce GTX 670 OC
OS: Windows 7 (x64)
Game settings: BF3 Ultra, 1920x1080
Monitor: Samsung S27A950D, 120 Hz
GeForce 301.42 Driver WHQL
Graphics card temps: <70C

A 3-way CrossFire 7970 setup averages 150 FPS. My 3-way SLI GTX 670s average 110 FPS. It's as if the third card doesn't exist! GPU usage stays low, around 60-80%.

In BF3 64-player multiplayer on Caspian Border and Gulf of Oman (especially the big maps!), my old GTX 570 SLI averaged 80-85 FPS. For 3-way GTX 670 SLI, 100-110 FPS is too poor a result; the values should be higher. Is there a 3-way SLI scaling problem in BF3? Do you think a new NVIDIA driver will solve it? AMD cards handle this scaling better; they score a minimum of 120 FPS.

CallsignVega, I watched your YouTube videos. The 7970 performs best of your cards.
So, what should I do now?

The same problem shows up in Guru3D's investigations. It's clear the problem exists with 3-way and quad SLI GTX 680s. In fact, even the GTX 690 suffers from it.

Here is the link:
http://www.guru3d.com/article/geforce-gtx-670-2-and-3way-sli-review/15
http://www.guru3d.com/article/geforce-gtx-680-3way-sli-review/14

Some other games have the same problem, but BF3 is the important one for me. I sold my SLI GTX 570 cards just for that game and bought three GTX 670s. I'm disappointed right now.

In the meantime, I really like your 3x 120Hz portrait Eyefinity setup. I'll keep following you.









I apologize for my bad English
Best regards ..


----------



## renat77

Could the problem be NVIDIA's lack of PCI-E 3.0 support?

I tried the method to enable PCI-E 3.0 on my X79 motherboard, but only two of my cards worked; Win7 did not recognize the third card, and booting became very difficult. I have now gone back to the old settings.

I hope a new NVIDIA driver offers PCI-E 3.0 support to us.


----------



## CallsignVega

I have not seen any scaling issues with my GTX 680s in BF3. What kind of display configuration are you using? If I had to choose between 3x 7970's or 3x 670's, I would go with the 7970's.

That guru3d review is showing classic CPU limitation. At 1920x1200 2 way SLI is just about as fast as 3 way SLI. When the resolution is bumped to 2560x1600, the 3-way SLI starts to pull ahead of the 2-way SLI quite a bit.



You don't need more than two 7970's or two 680's for a single monitor IMO. Now, when running three+ screens, I've always found three and four GPU's scale really well. People always say "this and that GPU scales like crap in 3-4 way GPU". They don't understand that virtually all of those tests are on single monitor and are CPU limited, not "GPU scaling limited".
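The point about CPU-limited benchmarks can be sketched with a toy model: observed FPS is the minimum of what the CPU can feed and what the GPUs can render, so a third card only shows up once resolution pushes the GPU side below the CPU ceiling. All numbers here are made up for illustration:

```python
def effective_fps(cpu_cap_fps, single_gpu_fps, n_gpus, scaling=0.9):
    """Toy model: FPS is capped by whichever of CPU or GPUs is slower."""
    gpu_fps = single_gpu_fps * (1 + scaling * (n_gpus - 1))
    return min(cpu_cap_fps, gpu_fps)

# Low resolution: the CPU ceiling hides the third card entirely,
# which looks like "bad scaling" in single-monitor reviews.
low_res_2way = effective_fps(130, 70, 2)   # CPU-bound
low_res_3way = effective_fps(130, 70, 3)   # still CPU-bound, no visible gain

# High resolution: the GPUs become the bottleneck, so the third card helps.
hi_res_2way = effective_fps(130, 35, 2)
hi_res_3way = effective_fps(130, 35, 3)
```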

As for the X79 PCI-E 3.0, NVIDIA have just stated that they will not officially support PCI-E 3.0 for the 6xx series on X79. BUT, they said they will leave the option open to run PCI-E 3.0 on X79 at "user risk". As long as they keep the option available I am perfectly happy with that solution.

http://forums.nvidia.com/index.php?showtopic=232153

Now someone please tell me how to get rid of the smell from Rustoleum oil based paint! I painted the monitor mounts with that crap and here four days later I am still getting high from it. Must be the most toxic paint ever lol.


----------



## renat77

Quote:


> Originally Posted by *CallsignVega*
> 
> I have not seen any scaling issues with my GTX 680s in BF3. What kind of display configuration are you using? If I had to choose between 3x 7970's or 3x 670's, I would go with the 7970's.
> That guru3d review is showing classic CPU limitation. At 1920x1200 2 way SLI is just about as fast as 3 way SLI. When the resolution is bumped to 2560x1600, the 3-way SLI starts to pull ahead of the 2-way SLI quite a bit.
> 
> You don't need more than two 7970's or two 680's for a single monitor IMO. Now, when running three+ screens, I've always found three and four GPU's scale really well. People always say "this and that GPU scales like crap in 3-4 way GPU". They don't understand that virtually all of those tests are on single monitor and are CPU limited, not "GPU scaling limited".
> As for the X79 PCI-E 3.0, NVIDIA have just stated that they will not officially support PCI-E 3.0 for the 6xx series on X79. BUT, they said they will leave the option open to run PCI-E 3.0 on X79 at "user risk". As long as they keep the option available I am perfectly happy with that solution.
> http://forums.nvidia.com/index.php?showtopic=232153
> Now someone please tell me how to get rid of the smell from Rustoleum oil based paint! I painted the monitor mounts with that crap and here four days later I am still getting high from it. Must be the most toxic paint ever lol.


*First of all, thank you for the answers.

My game settings: BF3 Ultra, 1920x1080, on a Samsung S27A950D at 120 Hz.

My three cards perform the same as in the video linked below.






Your 7970 CF gets a minimum of 120 FPS, even though you run a 3x 120Hz portrait Eyefinity setup.






Here's what I do not understand!

Please help me get this solved; I'm sick of not being able to find the answer. You have the experience, and I trust that.

Should I put my 3x GTX 670 cards up for sale and buy 3x 7970s instead?

*


----------



## CallsignVega

3x 670's is overkill for 1080P. The GPU's won't be maxed out, you will be CPU limited. What CPU and speed do you run?

In my video, I am running a 2700K @ 5.2 GHz. That pushes a lot of info to the 7970's which are also overclocked to 1200 MHz. Plus not everything is set to Ultra in BF3.

You can see in that first video you posted his 4.4 GHz 2600K is much to slow to feed even two GTX 680's, let alone three cards.


----------



## thecrim

Awesome thread Vega! can't wait to see what else you come up with.

Is a 3820 @ 4.8GHz sufficient for 2 7970s?


----------



## renat77

Quote:


> Originally Posted by *CallsignVega*
> 
> 3x 670's is overkill for 1080P. The GPU's won't be maxed out, you will be CPU limited. What CPU and speed do you run?
> In my video, I am running a 2700K @ 5.2 GHz. That pushes a lot of info to the 7970's which are also overclocked to 1200 MHz. Plus not everything is set to Ultra in BF3.
> You can see in that first video you posted that his 4.4 GHz 2600K is much too slow to feed even two GTX 680's, let alone three cards.


*My CPU runs at 4.8 GHz ([email protected]).

What are your settings? Which settings did you drop? Mesh quality with 4x MSAA affects performance the most.

Is 3x 7970 CF overkill at 1920x1080?

Finally, I want to ask:
1. Why is there no CPU bottleneck with 3x CrossFire 7970? (150 FPS)
2. Why is there a bottleneck with 3x SLI GTX 680? (128 FPS, LOL)

Maybe it originates in a lack of NVIDIA driver support, or because NVIDIA, unlike AMD, doesn't support PCI-E 3.0 on X79. I do not know.





3. BF3 single-player mode uses all three GPUs fully (98%).






Why does BF3 multiplayer mode show only 60-80% GPU usage?





Whether you use GTX 600-series or 7970-series GPUs in multi-GPU mode at 1920x1080, single-player gets 98% usage while multiplayer stays in the 60-80% range.

Could you answer these?*


----------



## jetpak12

Quote:


> Originally Posted by *CallsignVega*
> 
> Now someone please tell me how to get rid of the smell from Rustoleum oil based paint! I painted the monitor mounts with that crap and here four days later I am still getting high from it. Must be the most toxic paint ever lol.


Lol, my dad uses Rustoleum for everything, but it's always for outdoor projects, so I've never noticed the smell. Just make sure to get plenty of ventilation in there, I guess.









Btw, awesome build, I enjoy watching your projects and seeing you push components to their limits!


----------



## rdrdrdrd

That paint is for outdoors vega


----------



## CallsignVega

Quote:


> Originally Posted by *thecrim*
> 
> Awesome thread Vega! can't wait to see what else you come up with.
> Is a 3820 @ 4.8GHz sufficient for 2 7970s?


Ya, that's a nice speed. Plus, with two 7970's, put them in their own 16x 3.0 slots and they should fly.








Quote:


> Originally Posted by *renat77*
> 
> *My CPU runs at 4.8 GHz ([email protected]).
> What are your settings? Which settings did you drop? Mesh quality with 4x MSAA affects performance the most.
> Is 3x 7970 CF overkill at 1920x1080?
> Finally, I want to ask:
> 1. Why is there no CPU bottleneck with 3x CrossFire 7970? (150 FPS)
> 2. Why is there a bottleneck with 3x SLI GTX 680? (128 FPS, LOL)
> Maybe it originates in a lack of NVIDIA driver support, or because NVIDIA, unlike AMD, doesn't support PCI-E 3.0 on X79. I do not know.
> 
> 3. BF3 single-player mode uses all three GPUs fully (98%).
> 
> Why does BF3 multiplayer mode show only 60-80% GPU usage?
> 
> Whether you use GTX 600-series or 7970-series GPUs in multi-GPU mode at 1920x1080, single-player gets 98% usage while multiplayer stays in the 60-80% range.
> Could you answer these?*


Yes, mesh sampling will strain the CPU much more than the GPU. I turn mesh down to low. 4x AA should be fine as that stresses the GPU's and doesn't really affect the CPU. You have to remember there are a lot of variables in those charts. If I remember correctly, those are really old benchmarks with early drivers. Not to mention, was it single player? Multi-player? Multi-player stresses the CPU a lot more than single player! This will lower GPU utilization.

I'll be selling my 680's and going with GHz edition X2 7970's or 7990's. Higher refresh rates on the Catleap's in multi-GPU mode and the drivers have really improved A LOT this month. I88Bastard has 2x 7970's with Eyefinity and he says he's never seen such smooth performance before the new drivers. Not to mention 79xx series scales better than the 6xx series the higher the resolution.

Not to mention AMD does it right on X79 with PCI-E 3.0. NVIDIA have been a bit stubborn in that regard, although I've never had any issues running 680's at 3.0 on my X79 with the reg edit. If I were you I'd get rid of the third 670, drop down to two 670's each in their own 16x slot, and enable PCI-E 3.0 if you haven't. If you want 7970's like I do, I'd get two GHz editions when they come out here soon, drop them in the X79 16x slots, and you will be golden! I bet you will get great performance at 1080P with two of those cards.

Quote:


> Originally Posted by *jetpak12*
> 
> Lol, my dad uses Rustoleum for everything, but its always for outdoor projects, so I've never noticed the smell. Just make sure to get plenty of ventilation in there I guess.
> 
> 
> 
> 
> 
> 
> 
> 
> Btw, awesome build, I enjoy watching your projects and seeing you push components to their limits!


Quote:


> Originally Posted by *rdrdrdrd*
> 
> That paint is for outdoors vega


Ya, even though the can doesn't say anything like "outdoor only", I wish it would mention the smell. Four days after drying and it still stinks to high heaven; that seems a bit overboard IMO. I didn't want to go with latex paint as that isn't durable. The back of the Rustoleum can even says "it's safe for children's toys"! Ya, if you want to get them high.


----------



## The viking

@ Vega!

I've just finished reading through this entire thread. Words cannot describe how big of an e-penis you have earned yourself.
Since you seem to be THE man to ask about graphics cards, I feel like I have to do it in this thread, in order to get the best answer possible.

I'm debating with myself on what kind of monitor I want to get next. I just moved from 3x24" monitors in an Eyefinity setup. I want to
go bigger, but only use one monitor for now. I've read tons about the Korean monitors, and I think the Achieva Shimian is just the one.
But with this bigger display, I think my system needs a few new components. Can you tell me which I should switch out, or maybe which
components I should buy if I have to buy an entire new system?
Current parts:
CPU: i7 930 D0 - running somewhere around 4.2GHz
RAM: Kingston HyperX 6GB kit, 2000MHz
MOBO: EVGA X58 SLI LE
GPU: ATI 6950 - this is the component even I know I have to change
HDD: Intel X25-M 80GB version
PSU: Corsair 1000W

With the new system, I want to be able to play BF3, GTA4/5, AC3 and a few other games at the highest res possible, with most filters on.
Will a "simple" 680 suffice?
Your opinion would be greatly appreciated.


----------



## rdrdrdrd

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, even though the can doesn't say anything like "outdoor only", I wish it would mention the smell. Four days after drying and it still stinks to high heaven; that seems a bit overboard IMO. I didn't want to go with latex paint as that isn't durable. The back of the Rustoleum can even says "it's safe for children's toys"! Ya, if you want to get them high.


LOL, we use it to paint fences and furniture at work, it takes about a week for the smell to become bearable even when outside.


----------



## renat77

@CallsignVega

Thank you for answered my questions.

I just sold one of the cards, so for now I'll continue with 2 cards. I guess waiting for the 7970 GHz edition to come out will be better. With 3x CF 7970 GHz Edition in BF3 at ultra settings at 1920x1080, I'm aiming for 150FPS. Do you think I can hit that goal?


----------



## CallsignVega

Just a quick update, three 2B Catleap's de-bezeled portrait at 102 Hz:










Loving the gloss IPS viewing angles and colors in portrait! Time to test these bad boys out in some games.


----------



## nvidiaftw12

I wonder how easy it would be to rob vega?









Jk, but those do look slightly misaligned from this angle.


----------



## TheBadBull

wohw my gawd.


----------



## l88bastar

Well done Vega those look pretty Epic in Portrait. I noticed you went with the step technique, which has worked out well.

On another note. I was wondering if you could help me out with a computer problem.

I am running an intel p4 processor and ATI Radeon 9700pro with a really HUGEM BIGGEM 60" 720p LCD and I was thinking about buying two more HUGEM 60" 720p tvs for a portrait setup like you have. My first question is, is one 9700pro enough or should I get another one for crossfire and my second question is do you think I should go with the three 60" 720p LCDs or should I upgrade to three FULL HD LCD HDTVs? And if I do upgrade to FULL HD TVs, should I then get 4 9700pros to quadfire and if so can you recommend me a motherboard that can accommodate 4 9700pros...oh and also can you recommend me a PSU that could push four 9700pros?

Thanks man...I appreciate your future help in the future at another time from now, but not right now cause thats impossible because I know you need to take the time to evaluate and then get back to me later on at another time that is not right now but in the future at a non present time.


----------



## stren

I want to go there!


----------



## kakee

That looks brilliant. I can only imagine what 102 Hz feels like at that resolution.

My portrait 3x1 setup will be ready next month with non-OC Catleap monitors.














The second batch of 2B extreme models is taking a little longer than expected.


----------



## thecrim

Quote:


> Originally Posted by *kakee*
> 
> That looks brilliant. I can only imagine what 102 Hz feels like at that resolution.
> My portrait 3x1 setup will be ready next month with non-OC Catleap monitors.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The second batch of 2B extreme models is taking a little longer than expected.


Just saw the update too







I ordered the Accell adapter and the 24AWG DL-DVI cable though; I needed something to last me until they were released.


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Well done Vega those look pretty Epic in Portrait. I noticed you went with the step technique, which has worked out well.
> On another note. I was wondering if you could help me out with a computer problem.
> I am running an intel p4 processor and ATI Radeon 9700pro with a really HUGEM BIGGEM 60" 720p LCD and I was thinking about buying two more HUGEM 60" 720p tvs for a portrait setup like you have. My first question is, is one 9700pro enough or should I get another one for crossfire and my second question is do you think I should go with the three 60" 720p LCDs or should I upgrade to three FULL HD LCD HDTVs? And if I do upgrade to FULL HD TVs, should I then get 4 9700pros to quadfire and if so can you recommend me a motherboard that can accommodate 4 9700pros...oh and also can you recommend me a PSU that could push four 9700pros?
> Thanks man...I appreciate your future help in the future at another time from now, but not right now cause thats impossible because I know you need to take the time to evaluate and then get back to me later on at another time that is not right now but in the future at a non present time.


I would personally go with the Voodoo 5 and a pair of Carmack goggles:


----------



## Citra




----------



## CallsignVega

Two GTX 680's are just awful trying to run these three Catleap's. Tons of stuttering and slow-downs in BF3. I have two MSI Lightning 7970's inbound to see if they fare better. If two of those can't run three of these screens, I don't think two X2 7970's will run five Catleap's any better, and I might have to re-evaluate my display setup. Pushing this many pixels at this high a refresh rate is seriously demanding.


----------



## thecrim

I'm currently using one Lightning and I'm sure they won't disappoint you!


----------



## zdude

X79?


----------



## iCrap

Vega where are you finding these awesome desktop backgrounds large enough to cover all your screens? I have such a hard time finding pictures for mine..


----------



## armartins

It's not really a surprise that 2 680s aren't running well enough (even MORE so considering your standards). You're pushing almost the same number of pixels as you had with your 3x30" and 4-way SLI'd 3GB 580s, and that setup had more raw power than 2 680s for sure... my guess is that neither 2 680s nor 2 7970s (even at 1350MHz) will be able to run that resolution at such a high refresh rate without cutting back a good deal of eye candy. But please elaborate on your "awful" definition. Was it pure lack of raw power, a VRAM bottleneck, or another glitch?


----------



## stren

Quote:


> Originally Posted by *iCrap*
> 
> Vega where are you finding these awesome desktop backgrounds large enough to cover all your screens? I have such a hard time finding pictures for mine..


After all the things you have seen, this is your question?


----------



## iCrap

Lmao, but yes. It's impossible to find pictures at that size...


----------



## CallsignVega

Quote:


> Originally Posted by *thecrim*
> 
> I'm currently using one Lightning and I'm sure they won't disappoint you!


What kind of clocks have you gotten it up to?
Quote:


> Originally Posted by *zdude*
> 
> X79?


I just may be going with the Extreme11 X79 if I don't switch over to 7990's/7970 X2's.
Quote:


> Originally Posted by *iCrap*
> 
> Vega where are you finding these awesome desktop backgrounds large enough to cover all your screens? I have such a hard time finding pictures for mine..


A lot of searching on Google. You do know Google's image section lets you select minimum image sizes, right?








Quote:


> Originally Posted by *armartins*
> 
> It's not really a surprise that 2 680s aren't running well enough (even MORE so considering your standards). You're pushing almost the same number of pixels as you had with your 3x30" and 4-way SLI'd 3GB 580s, and that setup had more raw power than 2 680s for sure... my guess is that neither 2 680s nor 2 7970s (even at 1350MHz) will be able to run that resolution at such a high refresh rate without cutting back a good deal of eye candy. But please elaborate on your "awful" definition. Was it pure lack of raw power, a VRAM bottleneck, or another glitch?


With the 680's it wasn't just low FPS, it was horrible stuttering and hitching. Wasn't because of VRAM either. The 2x 680's were running really badly even with every BF3 setting and every NVIDIA control panel setting at low/fastest possible. I honestly think we are a generation or two away from four GPUs being able to push five Catleaps at 100+ FPS. I have four MSI 7970 Lightnings inbound to see if they can run three Catleaps at 110+ FPS. I may be "stuck" at only three Catleap's. :EEK:

But I guess that is not a bad place to possibly be stuck at.


----------



## thecrim

This is my second Lightning, I RMA'd my last as it was faulty - running too hot and wouldn't allow anything above 1920x1200.

The one I received today, which has an ASIC of 59.8% reached 1300/1800 @ 1.35/1700v on air so far using the beta drivers you posted this morning. Benched circa 70C. Games at 50C at much lower clocks.

Benches from today all at 1920x1080:

Heaven
http://i.imgur.com/bHPID.png (I know it's not a screenshot next to the rocky corridor to prove extreme tessellation, but take my word for it)

3DMark11PP
http://3dmark.com/3dm11/3714589

3DMark11 EP
http://3dmark.com/3dm11/3714980

Screenshot of Settings
http://i.imgur.com/Xd0qO.png


----------



## CallsignVega

1300/1800, that's not too bad. I wonder if my four Lightnings will be able to approach that in 4-way CrossFire and not fall completely on their faces like my 680's do in SLI.


----------



## thecrim

You ordered another 2?! geez when will you have them by?

Please do a ton of benchmarks with them.

I've seen one other person reach 1300/1800 with these and at 1.25v. I don't know how well crossfire copes with cards which can be so different. I haven't delved into crossfire yet as I'd want to be pretty certain the second card had similar characteristics as my first. Not sure if that's too much of a problem under water.

EDIT: Have you ordered any of these http://kingpincooling.com/forum/showthread.php?p=22178#post22178 yet?


----------



## CallsignVega

Ya, decided to get four, why not.







These monitors demand some serious GPU power at this high a refresh rate. If the Lightnings work well I'll get four EK water-blocks; they are supposed to release in the next week or two. Supposedly the Lightnings do a lot better on water than on air at the lower ASIC levels. All of the Lightnings I've read about are below 75% ASIC, so it seems MSI has binned the chips for higher voltage leakage and higher performance under better-than-air cooling.

Not sure how my Sniper 3 will hold up under quad-8x PCI-E 3.0 with the PLX chip. If the Sniper 3 starts to bottleneck I may need to get the Extreme11 for quad-16x. I have a nice 3960X lying around here somewhere to use with that board.


----------



## renat77

*The 7970 GHz edition was a complete disappointment for me. The stock cooler is noisier, hotter, and consumes more power. Which 7970 specially designed for 3-way CrossFire would you recommend? I don't want the chassis turning into a temperature hell like with the 670 cards. And of course, don't forget performance.*


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, decided to get four, why not.
> 
> 
> 
> 
> 
> 
> 
> These monitors demand some serious GPU power at this high of refresh rate. If the Lightnings work good I'll get four EK water-blocks. They are suppose to release here in the next week or two. Supposedly the Lightnings will do a lot better on water than air with the lower ASIC levels. All of the Lightnings I've read about are below 75% ASIC so it seems MSI have binned the chips for the higher voltage leak and higher performance when under better cooling than air.
> Not sure how my Sniper 3 will hold up under quad-8x PCI-E 3.0 with the PLX chip. If the Sniper 3 starts to bottleneck I may need to get the Extreme11 for quad-16x. I have a nice 3960X lying around here somewhere to use with that board.


The Lightnings are compatible with EK? I didn't think they handled 5 monitors though? I was interested to see your struggles with the 680's. I had planned to wait for GK110 to move on from the 580 3GB cards, but now I wonder if I should copy you and break my NVIDIA habit.


----------



## xxbassplayerxx

Quote:


> Originally Posted by *stren*
> 
> The lightnings are compatible with EK? I didn't think they handled 5 monitors though? I was interested to see your struggles with the 680's. I had planned to wait for GK110 to move from the 580 3gb cards, but now I wonder if I should copy you and break my nvidia habit.


EK is making waterblocks specifically designed for the Lightning cards.


----------



## CallsignVega

Quote:


> Originally Posted by *renat77*
> 
> *The 7970 GHz edition was a complete disappointment for me. The stock cooler is noisier, hotter, and consumes more power. Which 7970 specially designed for 3-way CrossFire would you recommend? I don't want the chassis turning into a temperature hell like with the 670 cards. And of course, don't forget performance.*


It depends on if you want to water cool or not. When you go above two cards, you have to worry about sandwich issues with airflow (depending on motherboard). And if you go non-reference water cooling, there are only a few options.


----------



## CallsignVega

*VS*










Who will win this struggle? Pure brute power versus high resolution and high refresh rate?

Stay tuned for 2x GTX 680's vs 2x Lightning 7970's vs 4x Lightning 7970's running three 2B Catleap's at 4320x2560 @ 110 Hz benchmarks.


----------



## xxbassplayerxx

I pick the 7970s!


----------



## CallsignVega

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> I pick the 7970s!


I surely hope you're right.


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> I surely hope you're right.


+1 otherwise 5 monitors will be pointless unless you turn down the settings.

Which board are you running the test on?


----------



## CallsignVega

Quote:


> Originally Posted by *stren*
> 
> +1 otherwise 5 monitors will be pointless unless you turn down the settings.
> Which board are you running the test on?


Sniper 3, so 2x GPU's are running at 16x/16x 3.0, and then four Lightnings will be at quad-8X 3.0.


----------



## CallsignVega

This is bad, anyone want to buy some 2B Catleaps? lol


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> This is bad, anyone want to buy some 2B Catleaps? lol


If I had the money yes. So, I guess another diversion in this never ending buildlog?


----------



## Penryn

Quote:


> Originally Posted by *CallsignVega*
> 
> This is bad, anyone want to buy some 2B Catleaps? lol


This doesn't sound good at all.


----------



## Quest99

Bad as in BAD!?!?!


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> This is bad, anyone want to buy some 2B Catleaps? lol


Wow I didn't expect that outcome - how bad is bad? Is it limited by:
- vram
- pci slot bw
- cpu
- or the gpu processing capabilities?

I wanted to do this setup, but if Vega can't make it happen then I know I can't, haha. I'm just confused because you had 3x 2560x1600 displays working respectably before, right? This is less resolution than that, although now you have a higher potential frame rate; still, you should be able to at least beat the frame rates of that setup?


----------



## l88bastar

maple bacon


----------



## Samurai Batgirl

Vega, I think you should be stubborn about using the Catleap monitors. YOU MUST MAKE IT WORK!


----------



## nvidiaftw12

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> Vega, I think you should be stubborn about using the Catleap monitors. YOU MUST MAKE IT WORK!


Srsly. You have switched brands of graphics cards like 5 times on the same node!


----------



## CallsignVega

Still doing some more testing. Seems there are a ton of quirks. Crysis 2 crashes with DX11 but runs pretty well in DX9. BF3 seems a little better with 12.6 vs 12.7, but I wouldn't call it very smooth.

Witcher 2 is just abysmal with both. It shows 100 FPS but plays like 15 FPS. WoW in quad-fire gets less FPS than 2 cards and has unplayable hitching; with 2 cards it runs great though. Skyrim will only use a max of 2 GPUs no matter what I do.

Diablo 3 runs great in quad-fire though, LOL; I get over 220 FPS with all settings maxed.

So far a mixed bag, not overly impressed. It also might be a bit of me losing interest in multi-screen gaming. Eyefinity was a pain to set up. I had to use the custom .sys file to get past the pixel clock limit, and I found out the 330 MHz Accell adapters don't all overclock the same. Then I get the standard AMD "DP link degradation blah blah" error with the adapter 6 inches from the DP output; I don't even know how that's possible. Then the monitor goes black and/or glitches show up.

I might just pick the two fastest Lightnings and run those single screen. Going to try tri-fire now and see what I get.

In testing 2x 7970's beat out 2x 680's at this resolution besides Skyrim (first number is FPS, second VRAM usage):

Skyrim - All Ultra/4x MSAA

2x 680- 72/1863
2x 7970- 55/1942
4x 7970- Doesn't work

BF3 - Texture H/Shadow M/Effects L/Mesh L/Terrain L/Decor L/MSAA 0X/FXAA H/BLur Off/AF 16X/HBAO

2x 680- 59/1848
2x 7970- 69/1876

Diablo III - All in-game settings maxed

2x 680- 109/1242
2x 7970- 112/1247
4x 7970- 224

Metro2033 - Normal/0x MSAA

2x 680- 82/1323
2x 7970- 93/1342

Witcher 2 Enhanced - Medium

2x 680- 39/1491
2x 7970- 49/1377
4x 7970- 92 - Stuttering mess unplayable

Crysis 2 - High/DX11/Texture Pack

2x 680- 60/2025
2x 7970- 63/2515
4x 7970- DX11 Crash, DX9 works well 110 FPS

WoW - Max/4x MSAA

2x 680- 107/1702
2x 7970- 134/1675
4x 7970- 126 - Low GPU usage and hitching

I'd say 2x 7970's at stock average around 5-20% faster than 2x 680's at stock.
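For what it's worth, that 5-20% figure can be sanity-checked straight from the dual-card FPS numbers posted above; a quick sketch, using only the 2x results from this post:

```python
# Per-game percent difference of 2x 7970 vs 2x 680, from the FPS numbers above.
results = {  # game: (2x GTX 680 FPS, 2x HD 7970 FPS)
    "Skyrim":     (72, 55),
    "BF3":        (59, 69),
    "Diablo III": (109, 112),
    "Metro 2033": (82, 93),
    "Witcher 2":  (39, 49),
    "Crysis 2":   (60, 63),
    "WoW":        (107, 134),
}

for game, (fps_680, fps_7970) in results.items():
    delta = (fps_7970 - fps_680) / fps_680 * 100
    print(f"{game:<12} {delta:+6.1f}%")
```

Skyrim is the outlier in the 680's favor (about -24%); the other six land between roughly +3% and +26%, which is where the 5-20% average comes from.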


----------



## thecrim

What were the ASIC ratings of the 2 you decided to use for the tests? And what clocks did you get them to, at what voltage?

I still think you should hang onto them until the Extreme11 comes along. Other than that, I'd be interested in a Catleap









Don't bludgeon my expectations of this Accell adapter by saying they don't all clock the same!! how close did the others get to your highest clocked adapter?

At least you came out of this with a dedicated D3 system.


----------



## Acefire

Quote:


> Originally Posted by *CallsignVega*
> 
> Still doing some more testing. Seems there are a ton of quirks. Like Crysis 2 crashes with DX11 but runs pretty good DX9. BF3 seems a little better with 12.6 vs 12.7 but I wouldn't call it very smooth.
> Witcher 2 is just abysmal with both. Shows 100 FPS but plays like 15 FPS. WoW in quad-fire gets less FPS than 2 cards and has unplayable hitching. In 2 cards it runs great though. Skyrim will only use a max of 2 GPU's no matter what I do.
> Diablo 3 runs great though In Quad fire LOL I get over 220 FPS with all settings maxed.
> So far a mixed bag, not overly impressed. Also might be a bit of me losing interest in multi-screen gaming. Eyefinity was a pain to set up. Had to use the custom .sys file to get past the pixel clock limit and I found out the 330 MHz Accel adapters don't all overclock the same. Then I get the standard AMD "DP link degradation blah blah" with the adapter 6 inches from the DP output. Don't even know how that's possible. Then the monitor goes black and/or glitches show up.
> I might just pick the two fastest Lightnings and run those single screen. Going to try tr-fire now and see what I get.
> In testing 2x 7970's beat out 2x 680's at this resolution besides Skyrim (first number is FPS, second VRAM usage):
> Skyrim - All Ultra/4x MSAA
> 2x 680- 72/1863
> 2x 7970- 55/1942
> 4x 7970- Doesn't work
> BF3 - Texture H/Shadow M/Effects L/Mesh L/Terrain L/Decor L/MSAA 0X/FXAA H/BLur Off/AF 16X/HBAO
> 2x 680- 59/1848
> 2x 7970- 69/1876
> Diablo III - All in-game settings maxed
> 2x 680- 109/1242
> 2x 7970- 112/1247
> 4x 7970- 224
> Metro2033 - Normal/0x MSAA
> 2x 680- 82/1323
> 2x 7970- 93/1342
> Witcher 2 Enhanced - Medium
> 2x 680- 39/1491
> 2x 7970- 49/1377
> 4x 7970- 92 - Stuttering mess unplayable
> Crysis 2 - High/DX11/Texture Pack
> 2x 680- 60/2025
> 2x 7970- 63/2515
> 4x 7970- DX11 Crash, DX9 works well 110 FPS
> WoW - Max/4x MSAA
> 2x 680- 107/1702
> 2x 7970- 134/1675
> 4x 7970- 126 - Low GPU usage and hitching
> I'd say 2x 7970's at stock average around 5-20% faster speed than 2x 680's at stock.


Lol. Wow it seems I was right after all. Thanks Buddy.


----------



## Penryn

Time for me to go buy a second 7970...


----------



## stren

Wow didn't realize things were so buggy. Seems like both the 680 and 7970 have kinks to work out, so I'm somewhat glad I have my 580 3gb cards. Maybe by the time the GK110 comes out nvidia will have resolved their issues. Surprised how low the vram usage is. Seems like you need a texture pack to break 2gb, and even then 3gb should be plenty.

Maybe for now I'll wait haha. Seems with the new MacBook Retina display we might see companies push some higher-resolution desktop displays.


----------



## Bigbrag

This is why I just purchased 3 standard Catleaps. I got them for $289 apiece and can game in portrait IPS goodness with a single 6970. I'm getting 30-60fps in most games at High detail with no AA. I will eventually upgrade to a single 7970, or a 7990 when the prices drop. My previous setup was 7680x1080 @ 120Hz and I don't think I'm missing out on anything @ 60Hz now. I think the 120Hz Catleaps having to be overclocked, plus the adapters and cables having to be overclocked, is just too much ambition. Cool idea, but a ton of drawbacks to make it enjoyable. I already have to disconnect my adapters to get them to reset on a semi-annual basis.


----------



## CallsignVega

Quote:


> Originally Posted by *stren*
> 
> Wow didn't realize things were so buggy. Seems like both the 680 and 7970 have kinks to work out, so I'm somewhat glad I have my 580 3gb cards. Maybe by the time the GK110 comes out nvidia will have resolved their issues. Surprised how low the vram usage is. Seems like you need a texture pack to break 2gb, and even then 3gb should be plenty.
> Maybe for now I'll wait haha. Seems with the new macbook retina display that we might see a company push some higher resolution desktop displays


Ya, the 680's didn't run BF3 too well either. Just too many pixels to push at too fast a refresh rate IMO.

Two 7970's are working REALLY well on a single overclocked catleap. Super smooth. They also run the 3x1 portrait setup in WoW really well. I don't play WoW, just testing for Guild Wars 2 as they run about the same.

I've noticed this stand has a really nice feature. The hinges rotate, so I can switch from a 3x1 portrait setup to a single landscape center screen really easily! So basically I can have a single landscape screen for games like BF3 that don't work so well on the monster setup, and can convert to the 3x1 portrait setup for games like GW2 in less than 30 seconds. That, combined with the super smoothness and simplicity of 2x 7970's, is working well so far.

5x1 portrait is definitely out of the question. I don't know what I was smoking considering that idea LOL.

But if I go with this 3x1 portrait / 1x landscape overclocked Catleap hybrid, I will get rid of the Lightnings (thank you, Amazon easy return policy) and get two Asus ROG Matrix 7970's. The logic behind this is that the Lightnings only have single-link DVI ports, so even on a single Catleap I have to run the DP to DVI adapter, which limits me to around 109 Hz. With the Matrix 7970's dual-link DVI port I can run my best Catleap at 125+ Hz in the center and run the two side Catleaps off two of the DPs at ~109 Hz. Since I guarantee no one will make a water block for the Matrix, I would just do a custom GPU water block and put heatsinks on all the memory and VRMs.
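The 109 Hz vs. 125+ Hz split above is ultimately a pixel clock question. A rough sketch of the required clock for one 2560x1440 Catleap, assuming reduced-blanking timings with ballpark blanking figures (160 pixels horizontal, 41 lines vertical; an approximation in the spirit of CVT-RB, not the exact spec):

```python
def pixel_clock_mhz(hactive, vactive, refresh_hz, hblank=160, vblank=41):
    # Approximate required pixel clock for reduced-blanking timings.
    # The blanking figures are rough CVT-RB-style values, not exact.
    return (hactive + hblank) * (vactive + vblank) * refresh_hz / 1e6

# One 2560x1440 Catleap at various refresh rates:
for hz in (60, 102, 110, 125):
    print(f"{hz:>3} Hz -> ~{pixel_clock_mhz(2560, 1440, hz):.0f} MHz")
```

By this estimate, 110 Hz already needs around 443 MHz, well past both the 330 MHz rating of the Accell adapter and the nominal 330 MHz dual-link DVI limit. Everything above 60 Hz here runs the link out of spec; the DL-DVI port just happens to overclock further than the adapter does.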


----------



## CallsignVega

Forgot to mention that with the Catleaps and the DP-DVI adapters, the BIOS and POST screens do not show up. Another reason to go with a card that has a native DL-DVI connection like the Matrix 7970. I'm not putting a separate monitor on my desk just to be able to get into the BIOS, that's ridiculous.


----------



## l88bastar

Quote:


> Originally Posted by *Acefire*
> 
> Poo Poo Pancakes hardcore bottleneck me.


What the??? Dude, I dont even wanna know.....Too much info man!


----------



## Bigbrag

That's strange that you don't see the BIOS and POST process on your Catleaps with your DisplayPort adapters. I have 2 Monoprice adapters and I can see POST and enter the BIOS no problem. My DP adapters actually turn the screens on faster than the single monitor running off the dual-DVI input.


----------



## CallsignVega

Quote:


> Originally Posted by *Bigbrag*
> 
> That's strange that you don't see bios and post processes on your catleaps with your displayport adapters. I have 2 monoprice adapters and I can see my post processes and enter the bios no problem. My dp adapters actually turn the screens on faster than the single monitor run by the dual dvi input.


Ya, I'm not sure what's going on here. When the computer POSTs, the monitors are not overclocked, so I don't know what the deal is. Are you using the same 330 MHz Accell adapters that I am? (Not sure if Monoprice sells them, or are you talking about Monoprice's own brand?) This particular adapter I am using has a much faster pixel clock chip in it than all the others.


----------



## CallsignVega

Not entirely convinced on Z77 16x lanes being split into four 8x lanes with the PLX chip. Just to rule that out I bought a gun to play with my four 7970 Lightning's:










Oh, and five of these on the right:










The screen setup was right at the time; it was the GPUs that weren't. So I will be trying 5x1 portrait again, but this time with the Samsungs and not the Catleaps. The Catleaps just have too many pixels to push in 3-5x setups.
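The "too many pixels" point is easy to put numbers on; a quick comparison of the two 5x1 portrait walls (1080p Samsungs vs. 1440p Catleaps):

```python
def megapixels(width, height, count):
    # Total pixels across `count` identical panels, in millions.
    return width * height * count / 1e6

catleap_5x1 = megapixels(1440, 2560, 5)   # five 1440p panels in portrait
samsung_5x1 = megapixels(1080, 1920, 5)   # five 1080p panels in portrait

print(f"5x1 Catleap: {catleap_5x1:.1f} MP")
print(f"5x1 Samsung: {samsung_5x1:.1f} MP")
print(f"Samsung setup is {samsung_5x1 / catleap_5x1:.0%} of the Catleap pixel load")
```

That works out to about 18.4 MP vs 10.4 MP, so the 5x1 Samsung wall asks the GPUs for only ~56% of the pixels the 5x1 Catleap attempt did, and still fewer than even the 3x1 Catleap setup's ~11.1 MP.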


----------



## nvidiaftw12

Finally! Wondered when you would get that board. :thumb:


----------



## CallsignVega

Ya, looks like a really great board. They've also come out with water blocks for the board. I like how it is a full size board, so my #1 GPU won't be smashed up against the RAM slots like on the RIVE. That was annoying.


----------



## Quest99

Looks so much better without the bullets! I guess you were tired of waiting for the extreme 11....


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, looks like a really great board. They've also come out with water blocks for the board. I like how it is a full size board, so my #1 GPU won't be smashed up against the RAM slots like on the RIVE. That was annoying.


So it doesn't have the PLX lane splitter, meaning you'll be running the test with native PCI-E 3.0 x8 now? BTW, how do you like it vs the R4E, apart from the RAM slots?


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, looks like a really great board. They've also come out with water blocks for the board. I like how it is a full size board, so my #1 GPU won't be smashed up against the RAM slots like on the RIVE. That was annoying.


Fecal matter, at said moment, has been transcended from abstract theory,﻿ into a more literal configuration
http://www.youtube.com/watch?v=IUH3JQjcweM

and I just watched this...what did I just watch??? Can I has new eyeballz??
http://www.youtube.com/watch?v=dWpeBNCYU3g&feature=related


----------



## CallsignVega

Quote:


> Originally Posted by *Quest99*
> 
> Looks so much better without the bullets! I guess you were tired of waiting for the extreme 11....


I still may test the Extreme11, but that isn't supposed to be out until August. I am curious to test these four 7970 Lightning's on native PCI-E 3.0 without the PLX chip first.
Quote:


> Originally Posted by *stren*
> 
> So it doesn't have the PLX lane splitters so you'll be running the test with native pci-e3x8 now? BTW how do you like it vs the R4E apart from the ram slots?


Yes, the Big Bang has the native X79 direct-to-CPU 3.0 lanes at 16x/8x/8x/8x, versus the Sniper 3's single 16x 3.0 CPU link split into 8x/8x/8x/8x by the PLX chip. Who knows, it may very well make zero difference. Can't know for sure until you test!
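For reference, the per-slot bandwidth those configurations work out to is straightforward to compute; a quick sketch (PCI-E 3.0 runs at 8 GT/s per lane with 128b/130b encoding):

```python
# PCI-E 3.0: 8 GT/s per lane, 128b/130b encoding -> usable GB/s per lane per direction.
GBPS_PER_LANE = 8 * (128 / 130) / 8  # gigatransfers * encoding efficiency / bits-per-byte

for label, lanes in [("x16 slot", 16), ("x8 slot", 8)]:
    print(f"{label}: ~{lanes * GBPS_PER_LANE:.2f} GB/s per direction")
```

So each x8 slot still has ~7.9 GB/s each way. The bigger difference is aggregate: with the PLX chip, all four cards share one 16-lane link to the CPU (~15.8 GB/s), while the native 16x/8x/8x/8x layout is 40 dedicated lanes, so simultaneous GPU-to-CPU traffic never contends for the same link.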


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> I still may test the Extreme11, but that isn't suppose to be out until August. I am curious to test these four 7970 Lightning's on native PCI-E 3.0 without the PLX chip first.
> Yes, the Big Bang will have the native X79 direct to CPU 3.0 lanes of 16x/8x/8x/8x. That is versus the Sniper 3's 16x 3.0 to CPU lane split between 8x/8x/8x/8x with the PLX chip. Who know's, it may very well make zero difference. Can't know for sure until you test!


Yup - I'm always wary of pci lane splitters over native lanes unless you need more than can be supplied. It'd be nice to see proof to justify or refute my scepticism!


----------



## armartins

With the Samsung's you can eliminate the active adapters and run only mini-DP to DP cables... less hassle, fewer bezels... This is going to be pretty interesting...


----------



## CallsignVega

Quote:


> Originally Posted by *armartins*
> 
> With the Samsung's you can eliminate the active adaptors and run only mini-DP to DP cables... less hassle, less bezels... This is going to be pretty interesting...


Yup, no adapters and no special modded .sys file every time a new driver comes out. Just plug them in and go at 120 Hz.









Of course, everything has a trade-off though. I'm going to miss the awesome viewing angles of the Catleap's. The 5x1 Samsung setup will have a slight wrap-around look to it due to the viewing angles; it isn't as much of a problem with them in 3x1 mode though. Then again, I will get a bit faster pixel response time with the Samsung's.

Although DP has a pretty weak signal compared to DVI, I may have to get a bunch of these cables since I am about 25 feet from the computer:

http://estore.circuitassembly.com/products/long-haul-24-awg-mini-displayport-extension-cable-25-foot-35-foot-50-foot.html

24 AWG worked so well with the DVI cable and got me 5-10 more Hz of overclock on the Catleaps. This is the only DP cable I could find that uses 24 gauge.


----------



## stren

I for one am sad the catleaps did not work out. Would have been truly epic


----------



## zdude

Vega, you should start sending me all the hardware you buy, use for a week, and then let sit on a shelf


----------



## CallsignVega

Quote:


> Originally Posted by *stren*
> 
> I for one am sad the catleaps did not work out. Would have been truly epic


The 5x1 Catleap would work IMO if you only played less demanding games. I bet Diablo 3 would work awesome, for instance. I have very strict performance/smoothness standards, and 18.4 million pixels at 100+ FPS is pretty crazy to accomplish. Although, just in case something changes with the X79 board, my 4x 2B Catleaps aren't going anywhere until I confirm the Samsungs are working better.
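That 18.4-million-pixel figure checks out; here's a quick sanity check on the throughput involved for five 2560x1440 Catleaps:

```python
# Pixel throughput for a 5x1 portrait array of 2560x1440 panels.
width, height, screens = 2560, 1440, 5
pixels_per_frame = width * height * screens
print(f"{pixels_per_frame / 1e6:.1f} million pixels per frame")
print(f"{pixels_per_frame * 100 / 1e9:.2f} gigapixels/s at 100 FPS")
```

For comparison, a single 1080p screen at 60 FPS is about 0.12 gigapixels/s, so this setup pushes roughly fifteen times that.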
Quote:


> Originally Posted by *zdude*
> 
> Vega you should start sending me all the hardware you are buying then using for a week and letting sit on a shelf


Only if you pay for shipping.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> Only if you pay for shipping.


Pretty sure that is the best damn deal ever. Sign me up!


----------



## Swolern

Hey CallsignVega, I met you on a different forum. Looks as if you're the guru of 3/4-way SLI on these new 600-series cards. I have been trying to alleviate the low GPU utilization with my GTX 670 4-way SLI Z77 setup (G1 Sniper 3 / i7 3770K @ 4.88 / 670x4 2GB / 8GB RAM / Corsair Pro 1050W / 3x Asus 3D monitors), EVGA Classified 1200W on order. (No PSU issues so far, I just want to be safe with higher OCs.)

I have been testing multiple setups at different speeds. It looks as if it is a CPU bottleneck and not the bandwidth limitation of Z77. What do you think? For testing I loaded the same empty online map, put my character in the exact same spot, and looked at the exact same point in the map. The low GPU utilization corresponds with the beginning of the Swordbreaker campaign, like you tested in your PCI-E 2.0 vs 3.0 YouTube vids.









All testing was done in 5760x1080 2d 120hz

*** Vega settings:
MSAA- off
Effects Quality- low
Terrain Quality- low
Terrain Decor- low
Motion Blur- off
Ambient Occlusion- off
All other settings- ultra

As you can see, GPU utilization goes to the high 90s (%) when stressing the GPUs more with 2xMSAA and ultra settings. So this is showing a CPU bottleneck, correct? I made some progress switching to the 3770K and OCing it, with GPU utilization and FPS going up at your settings, but I'm not there yet. Is there anything you can suggest to help me get to 90% utilization at your settings on this platform, or should I start from scratch with the X79 platform? I can still exchange my CPU and mobo at a 15% restocking-fee loss.


----------



## CallsignVega

Yes, that looks like a nice progression in FPS numbers as the CPU clock gets higher and higher. The mid-90s usage in those numbers is about as high as you're going to get. It's almost impossible to get quad-99% due to the SLI overhead. Those numbers actually look really good.

The only reason you are getting ~80% usage on your top right is that the CPU is maxed out. You can compare with the bottom-right numbers at around the same FPS, as the settings are higher and push the GPUs harder than the top right.

Can I ask, have you played BF3 on any of the new CQ maps? Or conquest Metro? I find those maps particularly difficult to get really smooth, even if the FPS numbers are high. I am more interested in smoothness of frame delivery versus max FPS numbers.


----------



## zdude

Quote:


> Originally Posted by *CallsignVega*
> 
> The 5x1 Catleap would work IMO if you only played less demanding games. I bet Diablo 3 would work awesome, for instance. I have very strict performance/smoothness standards, and 18.4 million pixels at 100+ FPS is pretty crazy to accomplish. Although, just in case something changes with the X79 board, my 4x 2B Catleaps aren't going anywhere until I confirm the Samsungs are working better.
> Only if you pay for shipping.


Done, do you want me to PM you my address?


----------



## thecrim

Quote:


> Originally Posted by *zdude*
> 
> Done, do you want me to PM you my address?


+1


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> The 5x1 Catleap would work IMO if you only played less demanding games. I bet Diablo 3 would work awesome, for instance. I have very strict performance/smoothness standards, and 18.4 million pixels at 100+ FPS is pretty crazy to accomplish. Although, just in case something changes with the X79 board, my 4x 2B Catleaps aren't going anywhere until I confirm the Samsungs are working better.


Yeah - here's to hoping GK110 can handle 3 and that nvidia sorts their SLI issues...


----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> The only reason you are getting ~80% usage on your top right is due to CPU maxed out. You can compare with the bottom right numbers around the same FPS, as the settings are higher and pushing the GPU's harder than the top right.
> Can I ask, have you played BF3 on any of the new CQ maps? Or conquest Metro? I find those maps particularly difficult to get really smooth, even if the FPS numbers are high. I am more interested in smoothness of frame delivery versus max FPS numbers.


Yes, the online testing in the graph I posted earlier was actually from the new CQ map Scrapmetal. Even though the framerate is about the same with FXAA and 2xMSAA, FXAA is much smoother. With the new CQ maps, framerates range from 70 FPS up to 150 FPS depending on the situation. I have been getting used to these amazing 120 Hz monitors, and smoothness is very important to me too. With FXAA, CQ maps are very smooth 80% of the time: smooth when FPS > 80, amazing > 100 FPS, and absolutely stunning > 120 FPS. Below 80 FPS, gameplay starts getting a little choppy/laggy. I never thought I would call 80 FPS choppy, but these are my first 120 Hz monitors and they sure make you judge smoothness/lag from a whole new perspective.

So my current goal is a constant framerate above 100 FPS in BF3 multiplayer; a constant 120 FPS would be better if possible. How is your framerate in BF3 multiplayer with your 3960X / X79 / 680 4-way?

Also, most other AAA games run horribly even with the CPU OC @ 4.88 GHz: Crysis 2 @ 50% GPU utilization, Metro 2033 as low as 30% GPU utilization! Batman AC is just a complete mess, with frame rates and GPU use all over the place. Do these games run better for you? Or are these issues driver related?

Heavily contemplating going with a 3930K / Asus Rampage IV X79 even though I would be losing some cash. Would I get about the same gaming performance with the 3930K @ 5 GHz vs your 3960X?

Plus this Gigabyte G1 Sniper board has been one of the most unstable boards I've had, even at stock speeds. Should have stayed with Asus.


----------



## nvidiaftw12

Quote:


> Originally Posted by *zdude*
> 
> Done, do you want me to PM you my address?


+2


----------



## CallsignVega

Quote:


> Originally Posted by *zdude*
> 
> Done, do you want me to PM you my address?


Addresses received. You should expect your applicable package(s) before the weekend.
Quote:


> Originally Posted by *Swolern*
> 
> Yes, the online testing in the graph I posted earlier was actually from the new CQ map Scrapmetal. Even though the framerate is about the same with FXAA and 2xMSAA, FXAA is much smoother. With the new CQ maps, framerates range from 70 FPS up to 150 FPS depending on the situation. I have been getting used to these amazing 120 Hz monitors, and smoothness is very important to me too. With FXAA, CQ maps are very smooth 80% of the time: smooth when FPS > 80, amazing > 100 FPS, and absolutely stunning > 120 FPS. Below 80 FPS, gameplay starts getting a little choppy/laggy. I never thought I would call 80 FPS choppy, but these are my first 120 Hz monitors and they sure make you judge smoothness/lag from a whole new perspective.
> So my current goal is a constant framerate above 100 FPS in BF3 multiplayer; a constant 120 FPS would be better if possible. How is your framerate in BF3 multiplayer with your 3960X / X79 / 680 4-way?
> Also, most other AAA games run horribly even with the CPU OC @ 4.88 GHz: Crysis 2 @ 50% GPU utilization, Metro 2033 as low as 30% GPU utilization! Batman AC is just a complete mess, with frame rates and GPU use all over the place. Do these games run better for you? Or are these issues driver related?
> Heavily contemplating going with a 3930K / Asus Rampage IV X79 even though I would be losing some cash. Would I get about the same gaming performance with the 3930K @ 5 GHz vs your 3960X?
> Plus this Gigabyte G1 Sniper board has been one of the most unstable boards I've had, even at stock speeds. Should have stayed with Asus.


I've noticed the CQ maps and Metro are much less "smooth" than the outdoor maps (multi-monitor only). Not sure if it has to do with the dust/smoke blowing around or what. Anyone else notice this?

I never really run MSAA. It always seems more laggy under multi-monitor; FXAA seems to work a bit better. I understand what you mean with the 80 FPS reference. 80 FPS single-monitor would appear very smooth, but add in the multi-screen, multi-GPU quirks and 80 FPS can feel quite choppy. Honestly I didn't play a lot of BF3 on the 4-way 680 setup, so I don't remember.

I remember Crysis 2 and Metro ran really well. It definitely sounds like there is some sort of problem if you are seeing that low of utilization in those pretty demanding games, especially at that high of a CPU clock. At the same frequency, the 3930K should run pretty much the same as the 3960X; the 3960X just has 3 MB more cache, is all.

My X79 Big Bang arrives before the weekend and I will be doing a lot of testing, if you don't mind waiting to see if it makes a difference for me. Although if you really want X79, I don't see how it could end up worse. Just be prepared to activate PCI-E 3.0 via registry edit/NVIDIA .exe.


----------



## nvidiaftw12

Quote:


> Originally Posted by *CallsignVega*
> 
> Addresses received. You should expect your applicable package(s) before the weekend.


Wait what? You're really sending people stuff for only shipping?


----------



## csm725




----------



## thecrim

Hey this isn't right. He said "Addresses" and I didn't send an address and neither did nvidiaftw12!

I knew it was too good to be true.


----------



## nvidiaftw12

Quote:


> Originally Posted by *thecrim*
> 
> Hey this isn't right. He said "Addresses" and I didn't send an address and neither did nvidiaftw12!
> I knew it was too good to be true.


Take it easy, bro. We weren't the first to ask, and he might have accidentally added the es.


----------



## thecrim

* frantically f5'ing 120hz.net*


----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> I remember Crysis 2 and Metro ran really well. It definitely sounds like there is some sort of problem if you are seeing that low of utilization in those pretty demanding games, especially at that high of a CPU clock. At the same frequency, the 3930K should run pretty much the same as the 3960X; the 3960X just has 3 MB more cache, is all.
> My X79 Big Bang arrives before the weekend and I will be doing a lot of testing, if you don't mind waiting to see if it makes a difference for me. Although if you really want X79, I don't see how it could end up worse. Just be prepared to activate PCI-E 3.0 via registry edit/NVIDIA .exe.


I can't wait too much longer; I have 5 days left to return my G1 Sniper 3 mobo and 3770K. But I've decided to go ahead with X79 and the 3930K (just have to look for a good price). I love the Asus boards, so I think I'm going with the Rampage IV Extreme like you.

If you or anyone else has an extra X79 board or SB-E processor, I would be willing to trade (or pay extra) with my EVGA GTX 680 reference card (which I'm not using and was going to sell). If you or anyone else needs an extra GTX 680, please PM me. I also have a Corsair 1050W Pro for trade or sale. I have them on eBay, but I hate their fees!

BTW, thanks for all the help Vega, you've been great.


----------



## CallsignVega

The RIVE is a good board, but I wanted to change it up with a larger board, so I have the MSI Big Bang inbound. There is actually a guy selling a Big Bang with a water block already on it in the for-sale section, if you water cool. Actually, he put the stock heat sink back on it, so you may be able to get just the board cheaper if you are looking to save some money and not water cool.


----------



## zdude

Quote:


> Originally Posted by *nvidiaftw12*
> 
> Take it easy, bro. We weren't the first to ask, and he might have accidentally added the es.


I sent mine


----------



## Bigbrag

Those Samsungs are sweet! To answer your earlier question, I'm using the Monoprice brand mini DP to dual-link DVI adapters. I'm running a 6970 that has mini DPs, so that's why I chose those adapters. I was surprised how well Max Payne 3 ran on my three Catleaps when I tried it today: everything all the way up with 2x AA, and I was getting 60 FPS on a single 6970. I used to be more aggressive with my PC purchases. I had several SR-2s and four 480s when they were released, then two 590s when those were released. I was just unhappy with every big purchase I made, so now I'm waiting it out for the current gen to drop in price and investing in screens, speakers and other accessories. I'm still debating debezeling my Catleaps and running portrait, but I'm not sure how to mount them. If you are selling your custom mounts, or can PM the specifics on how you mounted them, that would be great.


----------



## CallsignVega

Quote:


> Originally Posted by *Bigbrag*
> 
> Those samsungs are sweet! To answer your earlier question, I'm using the monoprice brand mini do to dual link dvi adapters. I'm running a 6970 that has mini dp's so that's why I chose those adapters. I was surprised how good max Payne 3 ran on my three catleaps when I tried it today. Everything all the way up with 2x AA and I was getting 60fps on a single 6970. I used to be more aggressive with my pc purchases. I had several sr-2's and 4 480s's when they were released then 2 590's when those were released. I was just unhappy with every big purchase I made so now I'm just waiting it out for the current gen to drop prices and investing into screens, speakers and other accessories. I'm still debating debezeling my catleaps and running portrait but I'm not sure how to mount them. If you are selling your custom mounts or can pm the specifics on how you mounted them that would be great.


Have you seen this pic:










Pretty much explains everything about how I mounted the Catleaps.









On another note, I forgot how awesome the Samsungs look de-bezeled, with a 6 millimeter (under 1/4 inch) border gap:










I found that even using the single-link DVI connection on the 7970 Lightnings (165 MHz), I can get around 75 Hz out of the 700D (the right landscape monitor). So it will still be better than 60 Hz, and not as noticeable since it will be an end screen on the 5x1 setup. Eyefinity is great in that it will allow four screens to run at 120 Hz and an odd-ball screen to run at a lower Hz.
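That ~75 Hz figure lines up with the single-link DVI math. A rough sketch; the 2080x1111 frame totals are an assumption based on CVT reduced-blanking timings for 1920x1080, and tighter custom blanking explains squeezing a few extra Hz past this estimate:

```python
# Ceiling on refresh rate for a 1080p panel over single-link DVI.
PIXEL_CLOCK = 165e6            # Hz, the single-link DVI limit mentioned above
h_total, v_total = 2080, 1111  # assumed CVT reduced-blanking frame totals
max_hz = PIXEL_CLOCK / (h_total * v_total)
print(f"~{max_hz:.0f} Hz max at these timings")  # ~71 Hz
```

Reducing the blanking intervals further shrinks h_total and v_total, which is exactly how overclockers push a fixed pixel clock to higher refresh rates.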

This is the Vega monitor assembly line: five Samsungs to de-bezel and mount, four Catleaps to put back together

















So far in testing the 3x1 Samsungs, the quad-fire 7970s are working great. More to follow on 5x1 testing.


----------



## l88bastar

WOW VEGA THIS CATLEAP IS AWESOME!!!

THANK YOU VERY MUCH FOR SHIPPING IN SUCH A TIMELY MANNER!!!

Question: can I send you another address to ship one to my brother in Toledo, or is it just one Catleap per OCN member? I PM'd you his address just in case and PayPal'd you my estimated shipping cost


----------



## Djghost454

Quote:


> Originally Posted by *CallsignVega*
> 
> Only if you pay for shipping.


If you have stuff you don't want, I own a truck and live within driving distance of Fort Rucker, lol.


----------



## nathanak21

Any other stuff?


----------



## l88bastar

Hey Vega first I just want to say you have been super awesome about sending me the Catleap and also I see that your shipping my brothers as well and I just want to thank you again. Man you have been so generous, I am so happy I was able to RSVP some of your awesome gear in time!!

Anyway, I don't want to seem ungrateful, because you are awesome, but I am having a couple of issues with the FW900 you sent me last month... actually, let me be honest... OK... I'm not really having any issues with the FW900. I mean, it is an awesome A-grade FW900 and I know I shouldn't complain because you did give it to me for free (except for the $200 shipping that I paid), but you see, the thing is, I like the Catleap more (now that I have them side by side) and I was wondering if I could send you the FW900 back and have you send me out another Catleap instead. I hope I am not offending you, because you did go through all of that time custom-painting the bezel for me (thanks again for that!), but the thing is, now that I have used the Catleap for a bit, I just prefer the sleeker design and colors of the IPS panel over the heavy CRT.

Here is a picture of the FW900 so you know it's still in great shape and you'll be willing to take it back for an exchange:


----------



## Nocturin

Aww I missed out on the fun stuff







.

Oh well, still get to watch, at least.

Those who got things, have fun with them and post pictures!


----------



## stren

lulz


----------



## Baasha

Vega,

What stand is the one in your picture? Will that stand be able to take 3 30" monitors in portrait? If not, do you know of any good/high quality stand that will allow 3 30" monitors in portrait?


----------



## CallsignVega

Quote:


> Originally Posted by *Baasha*
> 
> Vega,
> What stand is the one in your picture? Will that stand be able to take 3 30" monitors in portrait? If not, do you know of any good/high quality stand that will allow 3 30" monitors in portrait?


Yes it can: http://www.wsgf.org/products/wsgf-edition-stand

Simply awesome stand.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes it can: http://www.wsgf.org/products/wsgf-edition-stand
> Simply awesome stand.


I ordered that stand today, even though I already had the 3x1 you gave to me for free, except for the shipping that I paid.


----------



## nathanak21

Hey Vega, do you still have the Fresnel lenses? Thinking about doing a build with some.


----------



## andrew grp

Hey Vega, I would be grateful if you could send me your debezeled wallet along with 4-way green-cash SLI for free; of course, I'm willing to pay for shipping.
Thanks in advance



Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *l88bastar*






On a serious note, why do the colors, and especially the blacks, seem so off in the picture?


----------



## mlb426

Vega, are you going to debezel the 700D? I know you mentioned it was near impossible to VESA mount, but I was wondering if you found some way to debezel and mount it in portrait


----------



## Snipe3000

Hi Vega,
Would it be possible to debezel and mount the 27" version of the Samsung monitor?


----------



## carramrod

Hey Vega, which eBay dealer/auction did you buy your Catleaps from? I checked the topic but didn't see it listed. Thanks.


----------



## xxbassplayerxx

Lots of new users in here. Did this thread get linked to from somewhere?


----------



## Nocturin

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> Lots of new users in here. Did this thread get linked to from somewhere?


Probably. I started following Vega on XS, and was pleased when I finally got linked to his threads here (a while back, during the chiller build).

It is abnormal though; I hate seeing "can I haz" posts pop up.


----------



## andrew grp

@xxbassplayerxx
People are always attracted to free things.

As for me, I couldn't keep up with so many posts that I wanted to read, so I had to use the subscription button (my browser was screaming under all those bookmarks).


----------



## nathanak21

I have been following this thread for a while. Sorry if it comes across like I am just here for stuff.


----------



## xxbassplayerxx

Not at all, I was just wondering what caused the unusual amount of new members. It's not a bad thing, just curious.


----------



## Nocturin

Quote:


> Originally Posted by *andrew grp*
> 
> @xxbassplayerxx
> People are always attracted to free things.
> As for me, I couldn't keep up with so many posts that wanted to read so I had to use the subscription button ( my browser was screaming with all those bookmarks).


Glad you finally decided to participate


----------



## andrew grp

Glad your story with Antec is finally coming to a good end.

BTW, when Vega is not here, off-topic posts skyrocket...


----------



## Nocturin

Quote:


> Originally Posted by *andrew grp*
> 
> Glad your story with Antec is finally getting to a good end.
> BTW When Vega is not here offtopic posts are skyrocketing...


Thanks









Not to the end yet, but it's looking up.

I'd say approximately 3/4 of the posts in the thread are off-topic. It's all for fun though


----------



## Detroitsoldier

Whenever I need to show someone what a crazy, over-the-top, best-of-the-best system would look like, I usually reference pictures of Vega's from this thread.

Thanks for all of the amazing builds!


----------



## Nocturin

This.

"Oh, you want to see awesome?!"

*comes here


----------



## carramrod

Maybe. I've been lurking for a long time, but never got around to making an account. I'm planning to build a high-end battlestation similar to Vega's in a few months, after I do my research and planning. It's nice to see the trial and error he's doing, because it helps me make better decisions on which parts to stay away from. Not many benchmarking sites test setups close to his, so it's good to see first-hand results.


----------



## throne4me

epicness


----------



## nvidiaftw12

Mother of God.


----------



## l88bastar

Hey Vega, I live in a van down by the river and populate an alternate parallel universe most of the time. My spaceship is currently in the shop with a bad flux capacitor, but I would be willing to take some extra things off of your hands as long as you pinky swear to their quality and go halfsies on shipping with me


----------



## Nocturin




----------



## CallsignVega

Big Bang is coming online. This baby is sweet. 5x1 Samsung 120Hz setup is complete, should have some pics soon. Hopefully AMD drivers will play nice this time.


----------



## nathanak21

**fingers crossed**


----------



## Quest99

Cannot wait to see your new videos!


----------



## Baasha

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes it can: http://www.wsgf.org/products/wsgf-edition-stand
> Simply awesome stand.


Okay, I don't mean to sound like a noob, but that WSGF edition says it comes with a "Five Segment Bar" (meant for 5 monitors). Also, it says it can take monitors *up to* 27". Does it need any modification to be able to fit 30" monitors? I would like to just assemble the stand, mount my monitors, and call it a day.

I want to order that thing ASAP but I just want to make sure I'm getting the right product.

Also, as a segue, why did you decide to go with the 2GB Lightnings instead of the 4GB Classifieds for your 5x1 setup? Don't you need as much VRAM as possible (like my setup)?


----------



## Swolern

5x1 120hz are going to be amazing. Wondering how the 7970s are going to handle that extreme resolution. Looking forward to some BF3 benches!









Keeping my fingers crossed for ya Vega @ the AMD drivers.


----------



## CallsignVega

Quote:


> Originally Posted by *Baasha*
> 
> Okay, I don't mean to sound like a noob, but that WSGF edition says it comes with a "Five Segment Bar" (meant for 5 monitors). Also, it says it can take monitors *up to* 27". Does it need any modification to be able to fit 30" monitors? I would like to just assemble the stand, mount my monitors, and call it a day.
> I want to order that thing ASAP but I just want to make sure I'm getting the right product.
> Also, as a segue, why did you decide to go with the 2GB Lightnings instead of the 4GB Classifieds for your 5x1 setup? Don't you need as much VRAM as possible (like my setup)?


The stand can take 30" monitors in portrait. Landscape might be stretching it. And the five segment bars aren't just there for five monitors. They allow three screens to wrap around more naturally than three segment bars.

I have 7970 Lightnings, not 680 Lightnings.

Got the Big Bang set up. This has been a great motherboard to work with, and the build quality is fantastic. This is the first motherboard this year, out of the 4 or 5 I have set up, that detected both keyboards and a working mouse in the BIOS on the first try. I'm using BIOS 1.4B2 at the moment and haven't had a single hitch.

The Gatling-gun VRM heat-sink is pretty subdued and looks decent. The ammo heat-sink does look pretty cheesy in pictures, but in real life it looks a tad better. The tips actually look like real brass, though in pictures the whole thing looks like plastic. If I stick with the board I will most likely get water blocks, so that will all be moot anyway.

Even the X79 RAID was easy to get into and setup if you have the proper drivers. Something I learned the hard way with the RIVE!

The UEFI is a pleasure to work with and nicely laid out. Both MSI and Gigabyte have great BIOSes now, easily up to par with Asus.










As for the 5x1 display setup, I'm having issues with all of the monitors being detected/displayed, even in regular desktop mode; I haven't even gotten to Eyefinity yet. I have 24-gauge, high-quality DisplayPort cables arriving tomorrow, so hopefully this will get resolved.


----------



## ssss69

Do you notice differences between the Samsung and the Catleap image quality?

Sent from my GT-N7000 using Tapatalk 2


----------



## CallsignVega

Quote:


> Originally Posted by *ssss69*
> 
> Dos you notice differences between the samsung and the catleap imagen cuality?
> Enviado desde mi GT-N7000 usando Tapatalk 2


The Samsungs are pretty good for TN panels. But yeah, the IPS's higher resolution, colors and viewing angles give it the edge. The Samsungs have a much easier time in Eyefinity, though, due to fewer pixels per screen; they also have much smaller bezel gaps and faster pixel response times, and can reach 120 Hz with no special adapters/converters/edited drivers, etc. So it's a trade-off.


----------



## csm725

Quote:


> Originally Posted by *ssss69*
> 
> Dos you notice differences between the samsung and the catleap imagen cuality?
> Enviado desde mi GT-N7000 usando Tapatalk 2


If you prefer speaking in Spanish, send me a PM and I'd be happy to help you with any other questions you have about the monitors.


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> The Samsung's are pretty good for TN panels. But ya, the IPS's higher resolution, colors and viewing angles are going to give it the edge. But the Samsung's have a much easier time in Eyefinity due to less pixels per screen, much smaller bezel gaps, have faster pixel response times and can reach 120 Hz with no special adapters/converters/edited drivers etc. So it's a trade-off.


Those are the samsung A850's right?

They're PLS or PVA, not TN







.


----------



## De-Zant

Quote:


> Originally Posted by *Nocturin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> The Samsung's are pretty good for TN panels. But ya, the IPS's higher resolution, colors and viewing angles are going to give it the edge. But the Samsung's have a much easier time in Eyefinity due to less pixels per screen, much smaller bezel gaps, have faster pixel response times and can reach 120 Hz with no special adapters/converters/edited drivers etc. So it's a trade-off.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Those are the samsung A850's right?
> 
> They're PLS or PVA, not TN
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

No, Vega uses Samsung S23A750Ds and an S23A700D. They are 120 Hz TN displays.


----------



## fortunesolace

Quote:


> Originally Posted by *CallsignVega*
> 
> Got the Big Bang set up. This has been a great motherboard to work with, and the build quality is fantastic. This is the first motherboard this year, out of the 4 or 5 I have set up, that detected both keyboards and a working mouse in the BIOS on the first try. I'm using BIOS 1.4B2 at the moment


Where did you download the BIOS 1.4B2?


----------



## CallsignVega

MSI German site.


----------



## CallsignVega

I wonder if I should get paid for this?


----------



## nvidiaftw12

dowant.jpg


----------



## Quest99

Quote:


> Originally Posted by *CallsignVega*
> 
> I wonder if I should get paid for this?


WTH......Speechless..... like WOW. Now I really cannot wait for your videos.









Wonder how working photoshop would be on those?...hehe


----------



## iCrap

WOW That is just sick. Those are all catleaps?
But you know, i have to ask. Where did you get that awesome wallpaper?


----------



## Samurai Batgirl

Vega....


----------



## Nocturin

Quote:


> Originally Posted by *De-Zant*
> 
> No, vega uses samsung s23a750ds and a s23a700d. They are 120hz TN displays.


I stand corrected!


----------



## Swolern

Damn that is tight!!!
Doesn't look like you have much curve on that setup, Vega. How far away from the monitors are you going to have to sit to see the entire display?


----------



## cpachris

Quote:


> Originally Posted by *CallsignVega*
> 
> I wonder if I should get paid for this?


Wow. Not sure how I've missed how this was created. But., wow. How many pages do I need to go back to figure out how to do the same thing? This looks so cool!


----------



## nathanak21

Amazing.


----------



## l88bastar




----------



## Swolern

Quote:


> Originally Posted by *cpachris*
> 
> Wow. Not sure how I've missed how this was created. But., wow. How many pages do I need to go back to figure out how to do the same thing? This looks so cool!


All you need is about 7-8 grand $$$$$.
Is that about right Vega?


----------



## Particle

All of that work...and then I see a trackball. Puzzling.


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*


What the...
Quote:


> Originally Posted by *Particle*
> 
> All of that work...and then I see a trackball. Puzzling.


Pfft, I've won FPS tournaments with ma thumb track-ball. In Soviet Russia, trackball controls you!


----------



## CallsignVega

Everything is running pretty well, except I have found that mixed refresh rates are still pretty much a no-go. Vsync doesn't work right with the different refresh rates, and there is tons of tearing.

I think 7990s, 7970 X2s, or an "Eyefinity 6" type card, if they ever come out with them, are really needed to run this setup properly and get everything to 120 Hz. So I may have to return the Lightnings. Blasted single-link DVI ports!


----------



## zdude

Vega, How much do you spend each year on your computers?


----------



## l88bastar

Quote:


> Originally Posted by *zdude*
> 
> Vega, How much do you spend each year on your computers?


If I had to guess I would say.......Bout Three Fiddy

*Premium*


----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> Everything is running pretty well, except I have found that mixed refresh rates are still pretty much a no-go. Vsync doesn't work right with different refresh rates and there is tons of tearing.
> I think 7990s, 7970 X2s, or an "Eyefinity 6"-type card, if they ever come out with them, are really needed to run the setup properly and get everything to 120 Hz. So I may have to return the Lightnings. Blasted single-link DVI ports!


I knew it had to be too good to be true with AMD. They have let me down in the past and I never went back. Still, it was a valiant effort, Vega.

So how many DVI-D ports do you have? Only 4, one on each card? Any display ports inputs on the Samsungs?

I88Bastar what's up with all that spacing in your post? U are a weird one......


----------



## l88bastar

Quote:


> Originally Posted by *Swolern*
> 
> I knew it had to be too good to be true with AMD. They have let me down in the past and I never went back. Still, it was a valiant effort, Vega.
> So how many DVI-D ports do you have? Only 4, one on each card? Any display ports inputs on the Samsungs?
> I88Bastar what's up with all that spacing in your post? U are a weird one......


That there is premium spacing.


----------



## armartins

Quote:


> Originally Posted by *Swolern*
> 
> *I knew it had to be too good to be true with AMD.* They have let me down in the past and I never went back. Still, it was a valiant effort, Vega.
> So how many DVI-D ports do you have? Only 4, one on each card? Any display ports inputs on the Samsungs?
> I88Bastar what's up with all that spacing in your post? U are a weird one......


Please don't take this personally, and I definitely don't want to derail this awesome thread, which is actually my default way into OCN. Just recalling a few points.

- Vega is pretty much pushing the technology to its limits;
- 5 120Hz panels are equal to 10 DVI-SL links;
- If Vega locked his refresh rate to 60Hz I guess he would be perfectly fine;
- With only one Lightning he would be able to run those 5 panels @60Hz in BF3 at full low settings with only FXAA high and keep playable framerates (while having fun with it);
- I know he would, because I run BF3 at 3840x1920 with only one 6970DCII and it plays great at those settings;
- I'm not an AMD fanboy at all; IMHO the best video card of all time is the Nvidia 8800GT;
- Remember that in 2009 with AMD you were already able to use 3 displays with only one card, starting with cards in the $200 price range (before that, only a TH2GO would cost $300 and was much more restricted), so if you have surround gaming with NVIDIA, be thankful to AMD;
- In 2010 with the Eyefinity 6 cards you were already able to use up to six monitors with only one card (yes, stock traders and other professionals do need these kinds of setups);
- In December 2009 I bought a 5970; it was hot and loud as hell in my CM690, but it could already equal 2 5870s in clocks, and by April 2010 it was watercooled and running @1000MHz core clock; with Catalyst 10.4 it would net a 35K GPU score in Vantage, a respectable score even today;
- I switched to a 580DCII monster card with great performance, but again I could not use 3 monitors while one application ran fullscreen; to make it work properly I'd need another 580, which was a total waste of money (I was working with Unity3D designing games back then);
- So I switched to a single 6970DCII and have been happy ever since, with all-DisplayPort smoothness, no tearing at all in games while using Eyefinity, and Hydravision (if you don't know what that is, it alone would make the transition from NVIDIA to AMD worth it in terms of desktop productivity with grids + multiple desktops);
- So if you're disappointed with AMD's 5-portrait @120Hz support, tell me, what do you have in this regard from NVIDIA? What about 60Hz in 5 portrait? Wait, NVIDIA doesn't support 5 monitors, right?
- Don't get me wrong, I think that after the 4XX Fermi fiasco (besides the 460s) NVIDIA has been releasing great products; the 6xx turbo boost is a nice feature for non-SLI use and for those who just buy a card and toss it inside the case;
- I just want people to recognize that AMD has had good products in the last few generations and has been INNOVATING a lot in recent years. There are buggy drivers, sure. But NVIDIA has its glitches too (like: why not officially support PCI-E 3.0 on X79? There is a risk of a great driver being released with no X79 PCI-E 3.0 support at all). I just can't take this "AMD complaining everywhere";
- I know I've derailed a lot, lol, and this isn't in any sense personal to you, Swolern; it's just a "wrap up" of things I've been reading all over the place...


----------



## nathanak21

^^ rep+


----------



## CallsignVega

Armartins, good cliff notes. I don't fault AMD for me being stuck at 60 Hz on a 120 Hz setup. I knew both of the Lightning's DVI ports are single-link. I thought I could "trick" the drivers into running four screens at 120 Hz and the last at 60 Hz like I have in the past. They must have "fixed" that bug though, lol.

I just wish there was more news on the X2/7990 cards. I have manually adjusted all of the signal timings on the five monitors so they don't run past the 165 MHz cap of single-link DVI. That has given me 75 Hz across the Eyefinity setup as a temporary solution.
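For anyone wanting to sanity-check their own modelines: the pixel clock is just total pixels per frame (active plus blanking) times the refresh rate. A rough sketch in Python — the blanking values below are hypothetical tightened-timing examples, not the exact numbers used in this build:

```python
# Check candidate modelines against the single-link DVI pixel-clock cap.
# Blanking values are assumed examples of aggressively reduced timings.

SL_DVI_CAP_MHZ = 165.0  # single-link DVI spec limit

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz / 1e6

# 1920x1080 panel with reduced blanking (hypothetical values)
h_total = 1920 + 80  # active + horizontal blanking
v_total = 1080 + 15  # active + vertical blanking

for hz in (60, 75, 120):
    clk = pixel_clock_mhz(h_total, v_total, hz)
    verdict = "OK" if clk <= SL_DVI_CAP_MHZ else "over the cap"
    print(f"{hz:3d} Hz -> {clk:6.2f} MHz ({verdict})")
```

With timings like those, 75 Hz squeaks in just under 165 MHz while 120 Hz lands well past it — which is why 120 Hz needs pixel clocks far beyond the official single-link limit.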


----------



## CallsignVega

I have some great news: using the modified .sys file that ToastyX made, I can clock a 1080P 120 Hz monitor to over 100 Hz over a *single-link DVI* connection!







All hope is not lost for the Lightnings...

Now I have my whole 5x1 setup running 100+ Hz (100 Hz now, can squeeze some more Hz out of it with some patience).


----------



## l88bastar

Quote:


> Originally Posted by *armartins*
> 
> Please don't take this personally, and I definitely don't want to derail this awesome thread, which is actually my default way into OCN. Just recalling a few points.
> - Vega is pretty much pushing the technology to its limits;
> - 5 120Hz panels are equal to 10 DVI-SL links;
> - If Vega locked his refresh rate to 60Hz I guess he would be perfectly fine;
> - With only one Lightning he would be able to run those 5 panels @60Hz in BF3 at full low settings with only FXAA high and keep playable framerates (while having fun with it);
> - I know he would, because I run BF3 at 3840x1920 with only one 6970DCII and it plays great at those settings;
> - I'm not an AMD fanboy at all; IMHO the best video card of all time is the Nvidia 8800GT;
> - Remember that in 2009 with AMD you were already able to use 3 displays with only one card, starting with cards in the $200 price range (before that, only a TH2GO would cost $300 and was much more restricted), so if you have surround gaming with NVIDIA, be thankful to AMD;
> - In 2010 with the Eyefinity 6 cards you were already able to use up to six monitors with only one card (yes, stock traders and other professionals do need these kinds of setups);
> - In December 2009 I bought a 5970; it was hot and loud as hell in my CM690, but it could already equal 2 5870s in clocks, and by April 2010 it was watercooled and running @1000MHz core clock; with Catalyst 10.4 it would net a 35K GPU score in Vantage, a respectable score even today;
> - I switched to a 580DCII monster card with great performance, but again I could not use 3 monitors while one application ran fullscreen; to make it work properly I'd need another 580, which was a total waste of money (I was working with Unity3D designing games back then);
> - So I switched to a single 6970DCII and have been happy ever since, with all-DisplayPort smoothness, no tearing at all in games while using Eyefinity, and Hydravision (if you don't know what that is, it alone would make the transition from NVIDIA to AMD worth it in terms of desktop productivity with grids + multiple desktops);
> - So if you're disappointed with AMD's 5-portrait @120Hz support, tell me, what do you have in this regard from NVIDIA? What about 60Hz in 5 portrait? Wait, NVIDIA doesn't support 5 monitors, right?
> - Don't get me wrong, I think that after the 4XX Fermi fiasco (besides the 460s) NVIDIA has been releasing great products; the 6xx turbo boost is a nice feature for non-SLI use and for those who just buy a card and toss it inside the case;
> - I just want people to recognize that AMD has had good products in the last few generations and has been INNOVATING a lot in recent years. There are buggy drivers, sure. But NVIDIA has its glitches too (like: why not officially support PCI-E 3.0 on X79? There is a risk of a great driver being released with no X79 PCI-E 3.0 support at all). I just can't take this "AMD complaining everywhere";
> - I know I've derailed a lot, lol, and this isn't in any sense personal to you, Swolern; it's just a "wrap up" of things I've been reading all over the place...


Don't confuse whining customers with customers that do not care. For example, when I eat at a restaurant that is disgusting, I simply never go back, because there is nothing I can say that will make the terrible restaurant any better, there are plenty of other restaurants I can dine at instead, and ultimately I do not care about a restaurant that does not care about me. However, in the land of GPUs we only have TWO options, so we cannot simply go across the street and get our digital crack elsewhere.

Vega has not tried to do anything against spec... these cards are designed to accommodate 5x1, and the drivers should reflect that with screens that do not tear regardless of refresh rates and inputs. It is sloppy work on AMD's part and there is no way around that. In fact, I would have to say that AMD and its partners have been very lazy in the high-end GPU department with regard to connections, and there is ZERO reason why any $500+ GPU should have a single-link DVI output on it... ZERO... they save themselves what, fifty cents, by going with a single link instead of a dual link... It's unacceptable when we are spending this kind of money, and if they are going to get stingy with outputs then at least they could have the decency to do what Nvidia does and let you run displays separately off of multiple GPUs.

As a trifire 7970 owner, with one of my cards being an Asus CUII, which has four DisplayPorts and a single-link DVI just like Vega's Lightnings, I find it very frustrating that I am going to have to sell all three of my cards and replace them with 7990s just to support five displays properly, when they could have made a SIMPLE design decision that would have eliminated that hassle.


----------



## CallsignVega

Dual link DVI takes two display connection streams and Tahiti only has six streams. So really the only way you can do large Eyefinity is 6x mini-DP, 4x mini-DP + 1x DL-DVI, or 4x mini-DP and 2x SL-DVI connections. Asus and MSI try to please too many people with the 4x mini-DP and 2x SL-DVI cards. I think by far the best solution is 6x mini-DP, and then 4x mini-DP + 1x DL-DVI for proper 5x1. 3x2 Eyefinity setups are pretty lame IMO.
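The stream budget works out like a simple accounting problem. A toy Python check, using only the port costs described above (two streams for DL-DVI, one for everything else):

```python
# Toy check of Tahiti's six display-stream budget: a dual-link DVI
# output consumes two streams, mini-DP and single-link DVI one each.

STREAMS = {"mDP": 1, "SL-DVI": 1, "DL-DVI": 2}
MAX_STREAMS = 6

def stream_cost(ports):
    """Total display streams consumed by a list of outputs."""
    return sum(STREAMS[p] for p in ports)

configs = {
    "6x mDP":             ["mDP"] * 6,
    "4x mDP + 1x DL-DVI": ["mDP"] * 4 + ["DL-DVI"],
    "4x mDP + 2x SL-DVI": ["mDP"] * 4 + ["SL-DVI"] * 2,
}

for name, ports in configs.items():
    cost = stream_cost(ports)
    print(f"{name}: {len(ports)} outputs, {cost}/{MAX_STREAMS} streams")
```

All three layouts spend exactly six streams, which is why a board can't offer more outputs than these mixes without dropping something else.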

On to the good news. It's a good thing Tahiti has a monster pixel clock as I've found I am able to push some serious bandwidth over single link DVI and my monster 24 gauge DVI cable:










That took me a few hours of tweaking timings. I am literally within .02 MHz of not being able to run 120 Hz on my 700D (256 MHz seems to be the absolute pixel clock limit on SL-DVI). It was a real nail-biter edging back and forth getting those timings to work. So now my entire Eyefinity setup is rolling along nicely at a buttery-smooth 5400x1920 @ 120 Hz. Looks like the four Lightnings just may be keepers!


----------



## l88bastar

http://www.youtube.com/watch?v=MeFi3SDi_n8


----------



## m3t4lh34d

Quote:


> Originally Posted by *l88bastar*
> 
> http://www.youtube.com/watch?v=MeFi3SDi_n8


??


----------



## Swolern

Wow good work there Vega, I'm amazed!. Looks like you made the impossible possible.


----------



## Swolern

Quote:


> Originally Posted by *m3t4lh34d*
> 
> ??


Celebration!


----------



## carramrod

Vega can you post a pic of the back of the monitors? I'd like to see how they are mounted.

Also, which dealer on ebay did you get your catleaps from?


----------



## axipher

Quote:


> Originally Posted by *CallsignVega*
> 
> Dual link DVI takes two display connection streams and Tahiti only has six streams. So really the only way you can do large Eyefinity is 6x mini-DP, 4x mini-DP + 1x DL-DVI, or 4x mini-DP and 2x SL-DVI connections. Asus and MSI try to please too many people with the 4x mini-DP and 2x SL-DVI cards. I think by far the best solution is 6x mini-DP, and then 4x mini-DP + 1x DL-DVI for proper 5x1. 3x2 Eyefinity setups are pretty lame IMO.
> 
> On to the good news. It's a good thing Tahiti has a monster pixel clock as I've found I am able to push some serious bandwidth over single link DVI and my monster 24 gauge DVI cable:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That took me a few hours of tweaking timings. I am literally within .02 MHz of not being able to run 120 Hz on my 700D (256 MHz seems to be the absolute pixel clock limit on SL-DVI). It was a real nail-biter edging back and forth getting those timings to work. So now my entire Eyefinity setup is rolling along nicely at a buttery-smooth 5400x1920 @ 120 Hz. Looks like the four Lightnings just may be keepers!


Amazing to hear man 

I can't wait to get my hands on a 3+ miniDP 7970. I think making future cards miniDP-only with adapters would not only benefit the enthusiasts who need it, but also push the new technology further into the mainstream and reduce adoption costs for everyone.


----------



## 161029

Quote:


> Originally Posted by *CallsignVega*
> 
> I wonder if I should get paid for this?










Following this thread now.


----------



## ganganputput

wow vega. +100 man,,, you really done the impossible. double bow...


----------



## nathanak21

hey vega, have you seen the news on the new catalyst drivers with the 7970 ghz edition? looks like thats going to be the way to go for multi-monitor and high resolution setups.


----------



## CallsignVega

Quote:


> Originally Posted by *ganganputput*
> 
> wow vega. +100 man,,, you really done the impossible. double bow...


Thanks!
Quote:


> Originally Posted by *nathanak21*
> 
> hey vega, have you seen the news on the new catalyst drivers with the 7970 ghz edition? looks like thats going to be the way to go for multi-monitor and high resolution setups.


News? Checking for news now!


----------



## nathanak21

So far it's performing well in the 12.7 beta


----------



## nathanak21

http://www.overclock.net/t/1273746/radeon-7970-will-outperform-gtx-680-with-catalyst-12-7/100_50#post_17616347


----------



## mtbiker033

Quote:


> Originally Posted by *Quest99*
> 
> WTH......Speechless..... like WOW. Now I really cannot wait for your videos.
> 
> 
> 
> 
> 
> 
> 
> 
> Wonder how working photoshop would be on those?...hehe


tell morpheus and neo we said hello!









incredibly awesome!!!!


----------



## CallsignVega

Quote:


> Originally Posted by *nathanak21*
> 
> So far it's performing well in the 12.7 beta


Oh ya, I know about the 12.7 beta being pretty good. Unfortunately I cannot run them as BF3 crashes 100% of the time on my setup with 12.7. Not sure if it's 4-way crossfire, Eyefinity or the combination of them. It works fine with 12.6. I88Bastard also says BF3 crashes 100% of the time on his Eyefinity setup with 12.7.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> Oh ya, I know about the 12.7 beta being pretty good. Unfortunately I cannot run them as BF3 crashes 100% of the time on my setup with 12.7. Not sure if it's 4-way crossfire, Eyefinity or the combination of them. It works fine with 12.6. I88Bastard also says BF3 crashes 100% of the time on his Eyefinity setup with 12.7.


Ya, the 12.7 betas are a no-go on my CrossFire 7970 / Eyefinity 120Hz setup... I have two more 7970s inbound so I can see how trifire and quadfire pan out.

Alright Vega... we all feel like 8-year-olds waiting by the Christmas tree screaming "PRESANTS, PRESANTS, PRESANTS," so make with some new 5x1 videos dammit!!!! Either provide some 5x1 videos or I'm gonna spam your existing YouTube vids with "Why You no just go one Biggem FULL HD" comments


----------



## nathanak21

Dang. Maybe when the actual full driver comes out, AMD can finally get back on their A game


----------



## legoman786

Haha. I showed your 5 monitor setup to my coworker and he said that it screams "19 year old v-card holder." xD


----------



## l88bastar

your coworker sounds like a grade a doosh.


----------



## b0z0




----------



## Quest99

Quote:


> Originally Posted by *l88bastar*
> 
> your coworker sounds like a grade a doosh.


Hahaha....


----------



## legoman786

Quote:


> Originally Posted by *l88bastar*
> 
> your coworker sounds like a grade a doosh.


And, sometimes, he can be.

He also does not frequent OCN. It's very hard for him to believe stuffs on the internets, considering he worked at the AI lab at the UofA.


----------



## CallsignVega

Quote:


> Originally Posted by *legoman786*
> 
> Haha. I showed your 5 monitor setup to my coworker and he said that it screams "19 year old v-card holder." xD


I'm surprised he had enough time to view a photo. I know manning the drive-through window can get quite hectic at this time of day.


----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> Oh ya, I know about the 12.7 beta being pretty good. Unfortunately I cannot run them as BF3 crashes 100% of the time on my setup with 12.7. Not sure if it's 4-way crossfire, Eyefinity or the combination of them. It works fine with 12.6. I88Bastard also says BF3 crashes 100% of the time on his Eyefinity setup with 12.7.


What kind of framerate are you getting in BF3 with 12.6 at that massive resolution? And how much do you have to lower the settings to get it?


----------



## Swolern

Quote:


> Originally Posted by *legoman786*
> 
> Haha. I showed your 5 monitor setup to my coworker and he said that it screams "19 year old v-card holder." xD


Man i would give up all the putang i smashed to have that set-up!
Ok not all, maybe half
the crappy half


----------



## Nocturin

Quote:


> Originally Posted by *Swolern*
> 
> Man i would give up all the putang i smashed to have that set-up!
> Ok not all, maybe half
> the crappy half


this.


----------



## Hellish

You should consider, even if just for temporary use, an Eyefinity 4 (2x2) setup. You would essentially be using a 46" QFHD 3840x2160 @ 120Hz monitor, and those 6mm bezels may just feel like a giant crosshair (might be bad and distracting, might be good, who knows). All it would take is a new 2x2 monitor stand. You would lose the wider FOV, but given the amount of stretching the majority of games have, it might be worth it.

I may run some electrical tape 6mm wide across a 46" Bravia I have in my house to see how a 6mm cross on a screen that size feels (the sticky part wouldn't touch the screen, and I also would not get the awesomeness of [email protected])


----------



## ganganputput

My friend over here tried it with 4 23" screens, 2x2. It sucked bad; he took a long while to focus his vision left or right in BF3. Naturally we look straight ahead, and it was tough to imagine the line away. I had a go at his setup and tried gaming for 30 mins, but it really got on our nerves. Maybe if you sit really far away it can still work, but then you lose the immersion.

Nice idea though.
I wonder what would happen if we actually removed the black inside frame holding one edge of the LCD together. Would the monitor still work?


----------



## Aaron_Henderson

I'll share my lunch money with you Calsign if I can be your friend. Friends invite friends over to play with their toys though, right?


----------



## nathanak21

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I'll share my lunch money with you Calsign if I can be your friend. Friends invite friends over to play with their toys though, right?


Lol, you're gonna go from Ontario to Ft. Knox? Have fun lol


----------



## xxbassplayerxx

Hmm... Vega, I'm in Louisville. Can I come over to play if I bring some LN2?









EDIT: Your location says Ft. Rucker, not Ft. Knox. Dang


----------



## nathanak21

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> Hmm... Vega, I'm in Louisville. Can I come over to play if I bring some LN2?
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Your location says Ft. Rucker, not Ft. Knox. Dang


Wow. For some weird reason I could have sworn that was where he was from lol.


----------



## CallsignVega

Haha, I am in the arm-pit of the country, lower Alabamer. Although I am moving to Fort Bragg, North Carolina at the end of the year. Then I will start the geothermal cooling back up to get my cool-water overclocking going.

As for 2x2 or 3x2 monitor setups, I don't care for any setup that puts a bezel gap right in center vision. I only use 3x1 or 5x1.


----------



## nathanak21

Cool! I live in western NC


----------



## xxbassplayerxx

Quote:


> Originally Posted by *CallsignVega*
> 
> Haha, I am in the arm-pit of the country, lower Alabamer. Although I am moving to Fort Bragg, North Carolina at the end of the year. Then I will start the geothermal cooling back up to get my cool-water overclocking going.
> As for 2x2 or 3x2 monitor setups, I don't care for any setup that puts a bezel gap right in center vision. I only use 3x1 or 5x1.


Is a 5x3 setup possible? Hello 15 monitors...


----------



## nathanak21

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> Is a 5x3 setup possible? Hello 15 monitors...


Is there anything that could really power that?


----------



## De-Zant

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> Haha, I am in the arm-pit of the country, lower Alabamer. Although I am moving to Fort Bragg, North Carolina at the end of the year. Then I will start the geothermal cooling back up to get my cool-water overclocking going.
> As for 2x2 or 3x2 monitor setups, I don't care for any setup that puts a bezel gap right in center vision. I only use 3x1 or 5x1.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is a 5x3 setup possible? Hello 15 monitors...
Click to expand...

Yes, with a combination of TripleHead2Go, SoftTH, and Eyefinity, this guy did it: http://www.youtube.com/watch?v=8WETTt01OrU&feature=plcp

He had a thread on OCN somewhere about all his experiments with multiple monitors but I can't find it.


----------



## Swolern

^ LOL that's insane. What resolution is that?


----------



## kakee

5 x 1080p = 5400x1920 resolution. Love those small monitor bezels.


----------



## l88bastar

VEGA, HAVE YOU EVER HAD A MONITOR PHYSICALLY CHANGE SIZE???

I don't know what happened, but two of my Eyefinity monitors have shrunk approximately 4 inches! I am running three S27A750D Samsung 120Hz monitors and woke up this morning to find this.
What would cause this? Is this even possible? My mind is BLOWN!









*THIS IS WHAT THEY USED TO LOOK LIKE!!!*


----------



## nvidiaftw12

Lol. ^


----------



## 161029

Haha. What happened?


----------



## L D4WG

What?


----------



## CallsignVega

lol that would have worked better if the background and lighting was the same.


----------



## CallsignVega

For those interested in what the back of the setup looks like:






I am considering putting on a black felt cloth back to the whole setup once I get everything just right.


----------



## Snipe3000

There's more going on back there than I thought. Did you drill your own mounting holes?


----------



## UNOE

WOW... you're back on X79, and that screen! Where is the water block on the MSI X79 board?


----------



## CallsignVega

Quote:


> Originally Posted by *Snipe3000*
> 
> There's more going on back there than I thought. Did you drill your own mounting holes?


The four 750D monitors have two holes that line up with the VESA brackets. The panels are super light and that's all they need. The 700D (far left in video) has a much more challenging mount setup due to the really short ribbon cable. I used super strong servo tape to mount that bad boy off near the edge to keep the PCB in its needed position.
Quote:


> Originally Posted by *UNOE*
> 
> WOW... your back in x79 and that screen. Where is the water block on the MSI x79 board ?


Ya, X79 is where it's at.







I will put the XSPC water blocks on the Big Bang when I put the four EK blocks on the Lightning 7970's. Knock it all out at once.









12.7 is working awesome in 4-way crossfire. Nice and super smooth in BF3.


----------



## Mhill2029

As always....great work Vega!









I'm finally getting my screens soon, but I'm still stuck on what sizes to get. Do you feel Surround on 3x 27" in landscape is too overwhelming, or is 3x 23" better? I'll be getting the Samsung S23A750D or S27A750D models.

Do you have a guide to de-bezelling those screens by any chance? They look absolutely stunning once slimmed down...

Dunno about you guys, but 4-way SLI sucks in BF3 for me: bad frame rates and poor GPU usage. I mean, it averages between 70-90 FPS on a 64-player map like Caspian. That isn't right... I know a single screen will bottleneck such a setup, but even so the FPS should be exceedingly higher regardless. With 2x GPUs in SLI it runs like a dream, with an average of 80-115 FPS in 64-player games. Tri and quad are where the issues arise...


----------



## catcherintherye

Only thing that's not good about your setup, you need to get a new keyboard.


----------



## CallsignVega

If I were to go 3x1 landscape, I would get the 27". For portrait, though, the 23". My 3x1 120Hz thread has the guide for disassembling the Samsungs. The 23" and 27" are identical in that regard.

I find the 7970 better than the 680 when it comes to high-FPS/high-resolution multi-display setups. I am amazed at how well 4-way CrossFire is working in Eyefinity. I should have a video up soon.


----------



## ganganputput

Quote:


> Originally Posted by *CallsignVega*
> 
> If I were to go 3x1 landscape, I would get the 27". For portrait though the 23". My 3x1 120Hz thread has the guide for disassembling the Samsung's. The 23 and 27" are identical in that regard.
> I find the 7970 better than the 680 when it comes to high FPS/high resolutions multi-display setups. I am amazed at how well 4-way crossfire is working in Eyefinity. I should have a video up soon.


Ahh man, hearing this I am wondering if my 4x 670 4GB purchase was really wasted? I am running high-res multi-monitor at 7680x1440. Maybe if I pretend I didn't read this, I will forget


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> If I were to go 3x1 landscape, I would get the 27". For portrait though the 23". My 3x1 120Hz thread has the guide for disassembling the Samsung's. The 23 and 27" are identical in that regard.
> I find the 7970 better than the 680 when it comes to high FPS/high resolutions multi-display setups. I am amazed at how well 4-way crossfire is working in Eyefinity. I should have a video up soon.


you need a post/thread that you can link in your sig that's got links to all the things you've done









then you should call it

"Vega's Overflowing Exceedingly Awesome Bucket of Wonder"


----------



## Mhill2029

Quote:


> Originally Posted by *ganganputput*
> 
> ahh man hearing this i am wondering if my 4x 670 4gb purchase was really wasted? i am running high res and multi monitor at 7680x1440. maybe if i pretend i didnt read this, i will forget


lol don't get me started; what with the whole PCI-E 3.0 debacle I'm now considering selling my GPUs on OCN for the price of GTX 670s. From what I've been seeing so far though, it looks like it's purely driver maturity that's needed. It is still early days in that area.....

How is your average FPS on, say, Caspian with 64 players on a single display, out of curiosity? For me it's bad, like 70-90 FPS.


----------



## UNOE

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, X79 is where it's at.
> 
> 
> 
> 
> 
> 
> 
> I will put the XSPC water blocks on the Big Bang when I put the four EK blocks on the Lightning 7970's. Knock it all out at once.
> 
> 
> 
> 
> 
> 
> 
> 
> 12.7 is working awesome in 4-way crossfire. Nice and super smooth in BF3.


I have three 7970s now and I'm liking X79. I have to redo the TIM though; it's the second time now and it's a pain. I haven't had time yet.


----------



## CallsignVega

How does almost 4-way 99% GPU usage sound?









http://www.youtube.com/watch?v=IyQoFPO8RyA&feature=youtu.be


----------



## Snipe3000

That's hot, post more please, MORE!
Speaking of hot, how are you cooling those cards? Is there a waterblock for that PCB?


----------



## nunomoreira10

@CallsignVega,
do you think something like this would be possible?


Or something similar with the same concept to "hide" the black bezel?

Please disregard the bad drawing skill









Nice engineering work there


----------



## Hellish

Quote:


> Originally Posted by *Mhill2029*
> 
> With 2x GPU's in SLI it runs like a dream with an average of 80FPS-115FPS on 64Player games. TRI and Quad are where the issues arise...


Don't take my word for this, but a hunch I have is that running the 3930K @ 4.6GHz (as seen in your sig rig) won't even be close to fast enough for 4-way. I have found 4.6GHz to be just barely enough for 2-way, which is what I use; any lower and it feels bad.


----------



## CallsignVega

Quote:


> Originally Posted by *Snipe3000*
> 
> That's hot, post more please, MORE!
> Speaking of hot, how are you cooling those cards? Is there a waterblock for that PCB?


Everything besides the CPU is air cooled ATM. You can see the disparity in GPU temperatures and how hot they are getting sandwiched together. Still waiting on EK to release the Lightning blocks.

Quote:


> Originally Posted by *nunomoreira10*
> 
> @CallsignVega,
> do you think something like this would be possible?
> Please disregard the bad drawing skill
> 
> 
> 
> 
> 
> 
> 
> 
> Nice engineering work there


Hm, interesting. I was thinking about swapping out the black electrical tape for something grey/tan to "blend in" more with overall scenes. I looked for something like "matte reflective tape" or near-translucent tape but came up empty. Do you know of any such thing?


----------



## nunomoreira10

Quote:


> Originally Posted by *CallsignVega*
> 
> Hm, interesting. I was thinking about swapping out the black electrical tape for something grey/tan to "blend-in" more with overall scenes. I looked for something like "matte-reflective tape" or near translucent tape but I came up empty. Do you know of any such thing?


Can't find anything like that either.
How about reflective paint or spray? You could paint the tape itself.

edit:
found something
http://handystraps.co.uk/Reflective-tape/


----------



## l88bastar

Black bezel gaps are the least distracting...look how annoying my OOOOOLLLLD setup was with silver bezel gaps


----------



## Nocturin

you need some of these!


----------



## nunomoreira10

Vega,
I just did a test using aluminum foil on a triangular paper prism placed in the middle of my plasma television; see for yourself












Edit:
Reflective tape or paint won't work since it reflects the light back to the source; it needs to be something like this or a mirror-like matte surface.


----------



## rdrdrdrd

looks almost like a window pane


----------



## jetpak12

Quote:


> Originally Posted by *CallsignVega*
> 
> How does almost 4-way 99% GPU usage sound?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.youtube.com/watch?v=IyQoFPO8RyA&feature=youtu.be


Very nice!









I remember you having trouble with 4x Crossfire scaling in BF3 with two 6990s in the past (or was it 4x 6970s?). Do you think the difference is mainly drivers?


----------



## CallsignVega

Quote:


> Originally Posted by *nunomoreira10*
> 
> Vegas,
> just did a test using aluminum foil on a paper triangular prism shaped on the middle of my plasma television, see for yourself
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit:
> Reflective tape or paint wont work since it reflects the light to the source, it needs to be something like this or a mirror like matte surface.


Interesting. I was thinking about something like that, so it would be slightly illuminated by the image on both sides. Anyone know of any long, thin triangular objects no more than 1/4 inch per side? I think I've seen triangular pencils before. The challenge would be creating this triangular object and wrapping it in something like aluminum foil without wrinkling or creasing the two viewable sides that partially reflect the light.

If anyone has any ideas I am all ears.


----------



## CallsignVega

Does this look like exactly what I need?

http://shop.balsausa.com/product_p/105.htm


----------



## nunomoreira10

Just messed around a bit more with the reflecting prism and found that the smaller the apex angle of the triangle, the brighter it gets.
For my TV, 45° was best (a 45° apex and two 67.5° angles at the base).
That's when it's placed vertically (great viewing angles, so plenty of light hits the sides).
Horizontally I had to decrease it to 25° for the best image, so a very long, narrow triangle.
Your setup has TN screens, so you may have to experiment to find the best angle for you.

Hope you understand what I mean; English is not my primary language.
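For anyone trying to replicate the prism, the base angles follow directly from the chosen apex angle; a quick illustrative check of the figures above (assuming an isosceles cross-section):

```python
# Sanity check of the isosceles prism cross-sections described above.
# For an apex angle A, each base angle is (180 - A) / 2.

def base_angle(apex_deg):
    """Base angle of an isosceles triangle with the given apex angle."""
    return (180.0 - apex_deg) / 2.0

print(base_angle(45))   # 67.5  -> the 45/67.5/67.5 triangle used vertically
print(base_angle(25))   # 77.5  -> the narrower horizontal prism
```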


----------



## l88bastar

I don't know if you all remember the Seamless Display monitors:
http://www.seamlessdisplay.com/products_radius320.htm

But they aren't truly seamless at all; they actually use three screens pressed close together, like Vega has done, and then place a sort of beveled Fresnel lens over the bezel gaps which bends the light to give the appearance of being nearly seamless. If you watch the vids you can see the line where they join, but it's very minute. Here is how they explain the tech:
http://www.seamlessdisplay.com/co_technology.htm

Since they patented the technology, you can view US patent no. 6,927,908 online for free and see what they did to get it all to work.








http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.htm&r=1&f=G&l=50&s1=6,927,908.PN.&OS=PN/6,927,908&RS=PN/6,927,908



----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> How does almost 4-way 99% GPU usage sound?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.youtube.com/watch?v=IyQoFPO8RyA&feature=youtu.be


SHOCK & AWE! Great job as always.


----------



## Snipe3000

What are the measurements of your new top and side bezels?


----------



## CallsignVega

During more testing I've found that two of my four Lightnings each have one inoperable DisplayPort output, and my 3rd 16x PCI-E 3.0 slot has stopped working. Yay! Does Newegg cross-ship mobos if you give them your CC info? This may be the first mobo I've ever had to return for a malfunction.


----------



## Nocturin

Quote:


> Originally Posted by *CallsignVega*
> 
> Does this look like exactly what I need:
> http://shop.balsausa.com/product_p/105.htm
> ?


I would look into creating this out of sanded acrylic with a silver backing (like a mirror, but the sanding makes it matte; maybe a "smoked" acrylic) instead of something solid wrapped with foil. Could be done by a good machinist







, and some silver sided tape!

And good luck with your parts!


----------



## xxbassplayerxx

I've not had Newegg cross-ship. The only company that I know that does it is Asus with their ROG boards. Might be worth a try, though.


----------



## CallsignVega

I used a spare SSD to do a quick fresh Windows install to test. Sure enough, 16x slot #3 is deader than a doornail. Tried different BIOSes, flipping the slot kill-switch back and forth; nothing. The three other 16x slots pick up my 7970s fine. Back to Newegg this goes!


----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> During more testing I've found two of my four Lightnings each have one inoperable display port output and my 3rd 16x PCI-E 3.0 slot has stopped working. Yay! Does New-Egg cross-ship mobo's if you give them your CC info? This maybe the first mobo I've had to return for malfunction in well, ever.


How did you like the Big Bang compared to your RIVE? I might have a faulty mobo also and need to do a Newegg exchange.


----------



## CallsignVega

The RIVE worked pretty well. The Big Bang had really good components and the BIOS seemed solid, but the permanently disappearing PCI-E slots kind of put that board in a negative light for me.







I even removed the 7970's and put 680's in there and I had two dead slots.









On the plus side, being back on my Sniper 3 I've noticed Z77 has a quite a bit better SATA 3 RAID controller. My Vertex 4s in RAID 0 pull in a 2049 overall AS SSD score. They only managed around 1600 on X79. About a 28% speed increase, not too shabby!
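For the curious, the gain between the two AS SSD scores quoted above works out like this (plain arithmetic on the numbers in the post):

```python
# Percentage improvement between the two AS SSD overall scores quoted above.
x79_score = 1600   # Vertex 4 RAID 0 on X79
z77_score = 2049   # same array on the Z77 Sniper 3

gain = (z77_score - x79_score) / x79_score * 100
print(f"{gain:.1f}% faster")  # 28.1% faster
```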


----------



## Samurai Batgirl

http://www.legitreviews.com/news/13637/


----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> The RIVE worked pretty good. The Big Bang had really good components and the BIOS seemed pretty good. But the permanent disappearing PCI-E slots kinda put that board in a negative light for me.
> 
> 
> 
> 
> 
> 
> 
> I even removed the 7970's and put 680's in there and I had two dead slots.
> 
> 
> 
> 
> 
> 
> 
> 
> On the plus side being back on my Sniper 3 I've noticed Z77 has quite a bit better SATA 3 Raid controller. My Vertex 4's in RAID 0 pull in a 2049 overall AS SSD score. They only managed around 1600 on X79. Almost a 25% speed increase, not too shabby!


My 3rd PCI-E slot just died on my RIVE also. What's going on with quality control here?


----------



## Swolern

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> http://www.legitreviews.com/news/13637/


Whoa! PCI-E 3.0 x16 to all 4 GPUs. Hmmmm. I wonder how it will work with NVIDIA and PCI-E 3.0? Anybody know of an expected release date?


----------



## CallsignVega

Quote:


> Originally Posted by *Swolern*
> 
> My 3rd PCI-E slot just died on my RIVE also. Whats going on with quality control here????


For real? Did you mess around with the PCI-E disable switches? I remember once losing PCI-E connectivity on the RIVE but getting it back. No way I could recover the slots on the Big Bang XP2.
Quote:


> Originally Posted by *Swolern*
> 
> Whoa! pci-e 3.0 x16 to all 4 GPUs. Hmmmm. I wonder how it will work with the nvidia and pci-e 3,0???? Anybody know of an expected release date?


Ya, I am still interested in this board. Currently running my Sniper 3/3770K setup with the four Lightnings, and so far it's running pretty well. Getting max GPU usage doing 3x1 screen testing. Gonna have to try and get 5x1 running this weekend to see if the quad-8x PCI-E 3.0 slots hold up.
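For context on why quad-x8 3.0 should hold up: the theoretical one-way link bandwidth can be estimated from the transfer rate and encoding overhead alone. A rough sketch (theoretical maxima; real-world throughput is lower):

```python
# Approximate one-way PCI-E bandwidth per link, from signalling rate and
# line-code overhead. An x8 3.0 slot roughly matches an x16 2.0 slot.

def pcie_gbps(gen, lanes):
    """Theoretical one-way bandwidth in GB/s for a PCI-E link."""
    if gen == 2:    # 5 GT/s with 8b/10b encoding -> 0.5 GB/s per lane
        per_lane = 5.0 * 8 / 10 / 8
    elif gen == 3:  # 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane
        per_lane = 8.0 * 128 / 130 / 8
    else:
        raise ValueError("unsupported generation")
    return per_lane * lanes

print(round(pcie_gbps(3, 8), 2))   # 7.88  (x8 3.0)
print(round(pcie_gbps(2, 16), 2))  # 8.0   (x16 2.0)
print(round(pcie_gbps(3, 16), 2))  # 15.75 (x16 3.0)
```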


----------



## Swolern

Quote:


> Originally Posted by *CallsignVega*
> 
> For real? Did you mess around with the PCI-E disable switches? I remember once I lost a PCI-E connectivity on the RIVE but got it back..


I just made sure they were in the ON position and tried multiple GPUs and different PSU cables in the 3rd slot. The 3rd slot LED would not turn on. Sucks! But I have been having problems with the board starting about a week ago; I think it just finally gave out. http://www.overclock.net/t/1279558/framerate-gpu-utilization-drops-to-0-intermittent-power-and-flashes-to-usb-connected-devices-k-mouse/40#post_17700193


----------



## ganganputput

Quote:


> Originally Posted by *Swolern*
> 
> Whoa! pci-e 3.0 x16 to all 4 GPUs. Hmmmm. I wonder how it will work with the nvidia and pci-e 3,0???? Anybody know of an expected release date?


Finally approaching launch after a long wait since March, when they first announced the board's capabilities. Well, it's no shocking news for me; I'm waiting for the local prices to be released, which will tell me whether it's affordable enough to upgrade to.


----------



## l88bastar

I was going to go with a setup similar to Vega's, but I found that I preferred my larger S27A750s over the S23A750s. In the first photo you can see the size difference between the 23" & 27" monitors as I was swapping out the 23s for 27s.

I applied Vega's technique to remove the bezels; however, I stopped short of breaching the actual LCD metal shroud enclosure, so I was not able to get as much bezel reduction as Vega. However, I am sure you will all agree that it still looks stunning! These monitors are easy to mod, and if you can build a PC then you can mod these monitors' bezels!

So I am effectively running a 3240x1920 @ 120 Hz 47" display and it is AMAZING! Thank you for your hard work and for sharing your walk-through, Vega, because I would never have had the balls to do this mod if it weren't for your pioneering effort!














































And this is the eye of pain, anger and misery. These hex nuts were so tight on the 5x1 stand that I almost broke my hands getting them loose. These particular nuts need to be loosened so you can rotate the monitors. I had to apply a heat gun and a gallon of WD-40 to get 3 of them loosened.


----------



## CallsignVega

^ Looks good!

I found out doing some overclock testing that the PLX chip/Sniper 3 doesn't like too high of a base-clock increase. Even as low as 105 MHz would cause one of my 7970s to disappear. I was thinking maybe the same happened to the Big Bang, but I never adjusted the base clock on that above its default 100 MHz.


----------



## ganganputput

Quote:


> Originally Posted by *CallsignVega*
> 
> ^- look's good!
> I found out doing some overclock testing that the PLX chip/Sniper 3 doesn't like too high of a base-clock increase. Even as low as 105 MHz would cause one of my 7970's to disappear. I was thinking maybe the same happened to the Big Bang but I never adjusted the base clock on that above it's default 100 MHz.


Do you mean overclocking the monitor frequency or the card's base clock? Please forgive my ignorance.


----------



## Swolern

Looking good I88bastar.














With those high FPS rates, Close Quarters on BF3 shines. 100+ FPS and a larger field of view doubled my K/D ratio; it's almost unfair.


----------



## l88bastar

I tried out 5x1 with my three 27"s in the center and two 23"s on the sides. Even with tri-fire I was still able to eke out some impressive frame rates and things played well, but since my fifth monitor was locked at 60 Hz over its HDMI connection, it brought the whole setup down. Still, I am in love with the FOV that 5x1 offers, because it is the perfect combination of height and width and perfectly fills your natural line of sight. I am sending all of my 23"s back and gonna buy two more 27"s for the sides so I can do proper 5x1. I took some more photos for those interested, but please bear in mind that I did not spend a whole lot of time lining the displays up and will do a much better job with five 27s.


----------



## CallsignVega

The 23"s don't look as small as I thought they would against the 27"s. Although, it's a good thing your desk is deep, as you need a good four-foot viewing distance for 5x1 27".









Here's Bastard if he sits too close:



I had to turn all of my screens' brightness down to 40% when in 5x1. Don't want to get a sun-tan.


----------



## l88bastar

Here's how Vega games
http://lh4.ggpht.com/-7ZfZoK2awTw/T8fgfnLuCCI/AAAAAAAARHI/2VLeNo-OsdU/05091201_thumb.jpg

And here's how I like to roll








http://themonitorpage.com/images/monitor-distance.jpg


----------



## CallsignVega

No, that kid is much too far away.

You need to be able to see each and every one of the 10-million pixels clearly:


----------



## mlb426

Quote:


> Originally Posted by *CallsignVega*
> 
> The four 750D monitors have two holes that line up with the VESA brackets. The panels are super light and that's all they need. The 700D (far left in video) has a much more challenging mount setup due to the really short ribbon cable. I used super strong servo tape to mount that bad boy off near the edge to keep the PCB in it's needed position due to that short ribbon cable.
> Ya, X79 is where it's at.
> 
> 
> 
> 
> 
> 
> 
> I will put the XSPC water blocks on the Big Bang when I put the four EK blocks on the Lightning 7970's. Knock it all out at once.
> 
> 
> 
> 
> 
> 
> 
> 
> 12.7 is working awesome in 4-way crossfire. Nice and super smooth in BF3.


Vega, a walk-through of the 700D debezel/mount process, like you did for the 750D, would be much appreciated.


----------



## zdude

Quote:


> Originally Posted by *CallsignVega*
> 
> No, that kid is much too far away.
> 
> You need to be able to see each and every one of the 10-million pixels clearly:


lol just lol


----------



## Nope oO

Yo Vega, curious why you got rid of the overclocked Catleaps and the FW900 setups, after all that trash talking on TN panels.


----------



## Citra

Quote:


> Originally Posted by *Nope oO*
> 
> Yo Vega, curious why you got rid of the Catleaps OCd and the FW900 setups. After all that trash talking on TN panels


He didn't have enough gpu power for the three catleaps.

Sent from my HTC EVO 3D X515a using Tapatalk 2


----------



## xxbassplayerxx

Quote:


> Originally Posted by *Citra*
> 
> He didn't have enough gpu power for the three catleaps.
> Sent from my HTC EVO 3D X515a using Tapatalk 2


People have been running 3x 2560x1600 since Eyefinity was established. I don't think it was a lack of power, I think it was a lack of driver optimization.


----------



## ScribbyDaGreat

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> People have been running 3x 2560x1600 since Eyefinity was established. I don't think it was a lack of power, I think it was a lack of driver optimization.


Not at 120hz they haven't. He had issues related to drivers, yes, but his biggest problem was his desire to push 5 Cats at 120hz!


----------



## xxbassplayerxx

If you can run 3x 2560x1600, you have enough power to run 3x 2560x1440. Monitor refresh rate doesn't affect video output. He might not have been pushing them at 100+ frames at all times, but I don't think that makes the setup underpowered.


----------



## Citra

He didn't want to just run the games, he wanted them at ultra graphics settings @ 100+fps.


----------



## Samurai Batgirl

Ultra Mega Vega settings*


----------



## derickwm

Quote:


> Originally Posted by *l88bastar*
> 
> Black bezel gaps are the least distracting...look how annoying my OOOOOLLLLD setup was with silver bezel gaps


Quote:


> Originally Posted by *l88bastar*
> 
> I was going to go with a similar setup like vega did but I found that I prefered my larger S27a750s over the S23a750s. In the fist photo you can see the size difference between the 23" & 27" monitors as I was swapping out the 23s for 27s.
> 
> I applied vegas technique to remove the bezels, however, I stopped short of breaching the actual LCD metal shroud enclosure, so I was not able to get as much bezel reduction as Vega. However, I am sure that you will all agree that it still looks stunning! These monitors are easy to mod and if you can build a PC then you can mod these monitors bezels!
> 
> So I am effectively running a 3240x1920 @120hz 47" display and it is AMAZING! Thank you for your hard work and sharing your walk thru Vega, because I would never of had the balls to do this mod if it weren't for your pioneering effort!


What kind of house do you live in?

I'm jelly.


----------



## dmanstasiu

Quote:


> Originally Posted by *derickwm*
> 
> What kind of house do you live in?
> I'm jelly.


Only you would comment on architectural characteristics of a house during a computer build log


----------



## Nope oO

Quote:


> Originally Posted by *Citra*
> 
> He didn't want to just run the games, he wanted them at ultra graphics settings @ 100+fps.


Can you elaborate? The 4x 680s couldn't run the 3 Catleaps at 100hz properly? Or was SLI scaling not good enough?


----------



## l88bastar

Quote:


> Originally Posted by *derickwm*
> 
> What kind of house do you live in?
> I'm jelly.


Ohh I have a cozy little cottage out in the countryside









Where the Hot Sex flows freely









everything is kept neat and tidy









with mind blowing amenities









and an indoor BBQ


----------



## CallsignVega

Quote:


> Originally Posted by *ScribbyDaGreat*
> 
> Not at 120hz they haven't. He had issues related to drivers, yes, but his biggest problem was his desire to push 5 Cats at 120hz!


Yes, there is no 4-way GPU setup that could run five 2B Catleaps properly. I really wanted 5x1, so I went back to the Samsungs, as they are the best 120 Hz TN panels you can get, and nice and thin.

Guild Wars 2 last beta starts tomorrow. I will post on how 4-way Cross-fire in 5x1 works out.


----------



## xxbassplayerxx

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, there are no 4-way GPU's that could run five 2B Catleaps properly. I really wanted 5x1, so I went back to the Samsung's as they are the best you can get for 120 Hz TN panels and be nice and thin.
> Guild Wars 2 last beta starts tomorrow. I will post on how 4-way Cross-fire in 5x1 works out.


As long as the framerate was fluid and playable, I think I'd opt for the higher resolution over the higher framerate that comes with the lower res. If drivers were working a bit better, do you think it would have been powerful enough?

Also, can you see the difference between 60 fps and 100 fps?


----------



## Quest99

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, there are no 4-way GPU's that could run five 2B Catleaps properly. I really wanted 5x1, so I went back to the Samsung's as they are the best you can get for 120 Hz TN panels and be nice and thin.
> Guild Wars 2 last beta starts tomorrow. I will post on how 4-way Cross-fire in 5x1 works out.


Oh my....please make a video of GW2....my staff will go crazy over your setup rocking GW2!


----------



## CallsignVega

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> As long as the framerate was fluid and playable, I think I'd opt for the higher resolution over the higher framerate that comes with the lower res. If drivers were working a bit better, do you think it would have been powerful enough?
> Also, can you see the difference between 60 fps and 100 fps?


I think it comes down to the GPU's memory bandwidth. GTX 680s completely choked on three Catleaps; it was a stuttering mess at that high a resolution. The 384-bit bus of the 7970s works a lot better, but to get everything 100% perfect, hopefully the 89xx series will have a 512-bit bus. That would be awesome for display setups like mine and the 4K-resolution displays of the future.

There is definitely a point of diminishing returns on resolution, too. For instance, having 3.7 million pixels on each end monitor just for peripheral vision is definitely overkill, especially taking into account how much GPU power it takes to drive them. As for the "lower" resolution of the 5x1 TN panels, 10 million pixels still look marvelous, especially at 120 Hz. The 23" 1080p panels also scale really well against the 27" 1440p panels as the size decreases. 5x1 23" is a bit larger than most people realize, IMO. That's 58 inches diagonal, with the PPI and clarity to sit two feet from it, which makes for some awesome gaming.
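A back-of-the-envelope check of those figures, ignoring bezels (so the diagonal lands a bit above the 58-inch figure quoted above, which presumably accounts for the trimmed bezel overlap):

```python
import math

# Rough numbers for a 5x1 portrait array of 23" 16:9 panels (1080x1920 each),
# bezels ignored.

def panel_dims(diag_in, ar_w=16, ar_h=9):
    """Width and height in inches of a panel, from diagonal and aspect ratio."""
    unit = diag_in / math.hypot(ar_w, ar_h)
    return ar_w * unit, ar_h * unit

w, h = panel_dims(23)          # landscape dimensions of one panel
array_w, array_h = 5 * h, w    # portrait: the short side becomes the width

pixels = 5 * 1080 * 1920
diag = math.hypot(array_w, array_h)

print(f"{pixels / 1e6:.2f} megapixels")  # 10.37 megapixels
print(f"{diag:.1f} inch diagonal")       # ~59.8 inches before bezel overlap
```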


----------



## l88bastar

Quote:


> Originally Posted by *xxbassplayerxx*
> 
> As long as the framerate was fluid and playable, I think I'd opt for the higher resolution over the higher framerate that comes with the lower res. If drivers were working a bit better, do you think it would have been powerful enough?
> Also, can you see the difference between 60 fps and 100 fps?


I'm sorry, but did you just call 5400x1920 low res? LMAO, this is not 720p 120hz versus 1080p 60hz....Whatsa ten million pixels not enough for ya big spenda









And yes, there are miles of difference between 60hz, 100hz and 120hz.


----------



## De-Zant

He said lower res, not low res. Kind of a big difference there.


----------



## ganganputput

Quote:


> Originally Posted by *CallsignVega*
> 
> I think it comes down to the GPU's bandwidth. GTX 680's completely choked on three Catleaps. Was a stuttering mess at that high of a resolution. The 384-bit of the 7970's work a lot better, but to get everything 100% perfect hopefully the 89xx series will have 512-bit bandwidth. That will be awesome for the display setups like mine and 4K resolution displays of the future.
> There is definitely a point of diminishing return on resolution also. For instance having 3.7 million pixels on each end monitor just for peripheral vision is definitely overkill, especially taking into account how much GPU power it takes to run them. As for the "lower" resolution of the 5x1 TN Panel's, 10-million pixels still looks marvelous. Especially at 120 Hz. The 23" of each 1080P panel also scales really well compared to the 27" 1440P as the size decreases. 5x1 23" is a bit larger than most people realize IMO. That's 58 inches diagonal and with the PPI and clarity to sit two feet from it makes for some awesome gaming.


Oh, so your latest setup is 5x 23" TN 120 Hz panels at 1080p, not 5x 27" TN panels at 1440p. I thought you had managed to get an 18.4-million-pixel setup working at 120 Hz! Would be cool if that's possible, though.
So your 5 monitors are Samsung S23A750Ds; yeah, the original bezels are already quite thin. Hope my SA27850Ds have thin bezels inside them; the outer bezel looks fat...

I wish NVIDIA would enable their cards to run at least 5 monitors like AMD does; in the future I may consider 5x1 as a long-term goal. I don't really want to change to AMD cards again, after using them twice in the past: card failures, driver issues, high temps and stuff...


----------



## CallsignVega

AMD has come a long way. My four Lightning 7970s are running my 5x1 setup swimmingly and I haven't even overclocked them yet (waiting for EK blocks). Until NVIDIA wakes up and allows 5x1, I'll be with AMD for quite a while. Not to mention the 79xx series does better at high resolutions and refresh rates than the 680 anyway, with its narrow memory bus.


----------



## CallsignVega

Sweet, Guild Wars 2 is working pretty awesome in 4-way crossfire 5x1! I should have a video up this weekend.


----------



## renat77

Quote:


> Originally Posted by *CallsignVega*
> 
> How does almost 4-way 99% GPU usage sound?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.youtube.com/watch?v=IyQoFPO8RyA&feature=youtu.be


Congratulations! here it is! I always wanted to see the GPU usage and FPS score. Dream, come true!







I want this score. Of course one day if possible to 1920x1080 resolution.


----------



## Quest99

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet, Guild Wars 2 is working pretty awesome in 4-way crossfire 5x1! I should have a video up this weekend.


Cannot wait for that video!


----------



## supermi

I just read through the first 50 pages and have a question ...

I am running a surround 3d setup and am choosing some video cards for 4 way sli









I am trying to determine if 2 GB will be enough or if 4 GB will be necessary









I was thinking of getting 4 680 Lightnings and overclocking them with the voltage control; another option at a similar price would be the 4 GB FTWs or maybe the Classifieds...

Any thoughts? Seems this is the right thread to ask







!!!

and to be clear, as a 3D Vision Surround setup we are talking 3 1080p monitors... my mobo is the Rampage IV Extreme (I will enable PCI-E 3.0) and my 3920 is clocked at 5 GHz daily...

I am considering things like the fun of overclocking, the importance of memory overclocking (2 GB of video memory will likely clock better than 4 GB), and whether the total amount of video memory might or might not be necessary in top-tier games like Crysis 2, BF3 and Skyrim







...


----------



## CallsignVega

Guild Wars 2 runs incredibly well, especially for a Beta!

http://www.youtube.com/watch?v=0nHKqbNX4fs&feature=youtu.be


----------



## Quest99

Awesome video! Very smooth indeed.


----------



## l88bastar

Looks awesome... that type of game is not my cup of tea, but I'm tempted to buy it just because the developer was thoughtful enough to include solid 5x1 & multi-GPU support.

I'm looking forward to GTA V and the new Hitman game... unfortunately I bet neither of those two games will provide any kind of support for multi-GPU / 5x1 :{


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Looks awesome...that type of game is not my cup of tea but Im tempted to buy it just because the developer was thoughtful enough to include solid 5x1 & multi-gpu support.
> Im looking forward to GTAV and the new Hitman game...unfortunately I bet neither of those two games will provide any kind of support for multi-gpu / 5x1 :{


Dude get GW2 it's awesome.


----------



## thecrim

Man, that really makes me wanna preorder, but I cba to get my last 5 points for HoM!

3:17 fail hahaha

Please show us your setup when the 4 EK blocks arrive later this month!! As always, great work dude xD


----------



## CallsignVega

Quote:


> Originally Posted by *thecrim*
> 
> Man that really makes me wanna preorder but I cba to get my last 5 points for HoM!
> 3:17 fail hahaha
> Please show use your setup up when the 4 EK blocks arrive later this month!! As always great work dude xD


Haha, at that point I didn't know you could fall down there. I was like, oh crap, better not die! One good thing about GW2 is that even if you don't like to put in a lot of time, you can go right to the PvP arenas and it levels you to max with all PvP gear so you can get right into the action.

I will definitely post some pics when the setup is complete with the Lightning blocks.


----------



## l88bastar

I am so close, yet so far


----------



## nvidiaftw12

Surprised it doesn't fall on its side, with 2 monitors on one side and 1 on the other. Also, how do you get that mic to be loud enough? With mine it's never loud enough, even at full volume, even though it's the exact same mic that other people have.


----------



## CallsignVega

The Samsung panels are amazingly light. I think 3-4 of my 23" Samsung panels are the same weight as one Catleap panel.


----------



## nvidiaftw12

Hmm.


----------



## Nope oO

Quote:


> Originally Posted by *CallsignVega*
> 
> I think it comes down to the GPU's bandwidth. GTX 680's completely choked on three Catleaps. Was a stuttering mess at that high of a resolution. The 384-bit of the 7970's work a lot better, but to get everything 100% perfect hopefully the 89xx series will have 512-bit bandwidth. That will be awesome for the display setups like mine and 4K resolution displays of the future.
> There is definitely a point of diminishing return on resolution also. For instance having 3.7 million pixels on each end monitor just for peripheral vision is definitely overkill, especially taking into account how much GPU power it takes to run them. As for the "lower" resolution of the 5x1 TN Panel's, 10-million pixels still looks marvelous. Especially at 120 Hz. The 23" of each 1080P panel also scales really well compared to the 27" 1440P as the size decreases. 5x1 23" is a bit larger than most people realize IMO. That's 58 inches diagonal and with the PPI and clarity to sit two feet from it makes for some awesome gaming.


That's a bit disappointing, seeing as the 580s handled your triple 1600p quite nicely. I'm surprised Tahiti is doing so well in 4-way after the last try, with Caymans not working at all beyond 3-way. How's the micro-stutter?


----------



## CallsignVega

Quote:


> Originally Posted by *Nope oO*
> 
> That's a bit disappointing seeing as the 580s handled your triple 1600P quite nicely. I'm surprised Tahiti is doing so well in 4-way after the last try with Caymans not working at all beyond 3-way. How's the micro-stutter?


There is none.







Or at least the FPS is so high it completely smooths them out. GW2 is like glass.


----------



## Nope oO

Quote:


> Originally Posted by *CallsignVega*
> 
> There is none.
> 
> 
> 
> 
> 
> 
> 
> Or at least the FPS is so fast it completely smooths them out. GW2 is like glass.


BF3 is stutter-free with Tahiti too? Didn't you have micro-stutter trouble with the dual 6990s, which made you go back to 580s?


----------



## l88bastar

Quote:


> Originally Posted by *Nope oO*
> 
> BF3 is stutter free with Tahiti too? Didn't you have micro-stutter trouble with the dual 6990s which made you go back to 580s?


The 7970s are VASTLY BETTER than the 6970s. I am running quadfire 7970s @ 3240x1920 120 Hz and there is zero micro-stutter... but then again, as Vega says, this may be a function of our FPS being well over 120 most of the time.
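A quick look at the arithmetic behind that: the higher the sustained frame rate, the smaller the per-frame time budget, and frame-to-frame variation that is small relative to that budget is much less visible as micro-stutter (illustrative numbers only):

```python
# Frame-time budget at common refresh rates. Frame-to-frame variance that is
# large relative to this budget is what registers as micro-stutter, which is
# why holding FPS well above 120 makes frame delivery look smooth.

def frame_time_ms(hz):
    """Time budget per frame, in milliseconds, at a given frame rate."""
    return 1000.0 / hz

for hz in (60, 100, 120):
    print(f"{hz} fps -> {frame_time_ms(hz):.2f} ms per frame")
# 60 fps -> 16.67 ms, 100 fps -> 10.00 ms, 120 fps -> 8.33 ms
```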


----------



## Hellish

So I put my 3 XL2420Ts in portrait today and honestly do not know how you guys deal with the color shifting.


----------



## Nope oO

Quote:


> Originally Posted by *Hellish*
> 
> So I put my 3 xl2420t's in portrait today and honestly do not know how you guys deal with the color shifting.


And the horizontal refresh rate too. I think that might have been part of the problem with the eye strain on Vega's triple-1600p portrait setup.

Quote:


> Originally Posted by *l88bastar*
> 
> The 7970s are VASTLY BETTER than the 6970s. I am running quadfire 7970s @3240x1920 120hz and there is zero microstutter...but then again as Vega says this may be a function of our FPS being well over 120 most of the time.


Good to know. Guess I'll get another one (or two!) 7950s. I think I'd personally prefer 3 Catleaps in landscape to 5 portrait TN panels.


----------



## CallsignVega

Look ma, I'm famous!

http://twitter.com/GuildWars2/status/227126518244077570


----------



## Nocturin

hah, looking awesome and jelly as always.


----------



## l88bastar

I just got my *VEGA INSPIRED* 5x1 setup... set up


----------



## iLLGT3




----------



## kakee

What stand do you use? How big is the display area (in cm)? It would be amazing to see that in action with my own eyes.


----------



## l88bastar

Quote:


> Originally Posted by *kakee*
> 
> What stand you use? How big (cm) display area? Would be amazing see that own eye in live action


I'm using the exact same WSGF 5x1 Edition stand that Vega is using:
http://www.wsgf.org/products/wsgf-edition-stand

I am using the same displays as Vega (four Samsung S27A750s & one S27A950)... only difference is I went with the massive 27" displays and Vega went with the wimpy 23"s, lol, J/K


----------



## hakkafusion

Quote:


> Originally Posted by *l88bastar*
> 
> Im using the same exact WSGF 5x1 edition stand that Vega is using
> http://www.wsgf.org/products/wsgf-edition-stand
> I am using the same displays as Vega (Four Samsung S27A750s & one S27A950)...only difference is I went with the massive 27" displays and Vega went with the wimpy 23"s lol J/K


I like the thin bezel =)


----------



## FcZenitFan

Hey Vega. So if you've switched back to AMD, they must have gotten their s*t, I mean drivers, fixed? I had originally bought three 7970s (had them until the release of the GTX 680) and they would bluescreen in half my games; if I only used 2 of them, I'd get CRAZY micro-stutter in most games (although they'd work OK on a single screen), and with a single 7970 my computer would freeze in Skyrim and Serious Sam.

Since I switched over to NVIDIA, everything's been silky smooth running at 5760x1080, all games maxed out (except for MSAA in some cases like BF3) with great framerates. I still can't believe that AMD fixed their drivers up. I tried a 6990+6970 before (until January of this year) and that was not much better than the early 7970 drivers; lots of micro-stuttering in Eyefinity. Didn't you have 7970s at first, then switch to 680s, and now back to square one?


----------



## nvidiaftw12

Quote:


> Originally Posted by *FcZenitFan*
> 
> Hey Vega. So if you've switched back to AMD, they must have gotten their s*t, I mean drivers, fixed? I had originally bought three 7970s (had them until release of GTX680) and they would blue screen half of the games; if I only used 2 of them, then I'd get CRAZY microstutter in most games (although they'd work ok on a single screen), and with a single 7970 I'd have my computer freeze in Skyrim and Serious Sam.
> Since I've switched over to NVIDIA everything's silky smooth running at 5760x1080, all games maxed out (except for MSAA in some cases like BF3) with great framarates. I still can't believe that AMD fixed their drivers up. I've tried using 6990+6970 before (until January of this year) and that was not much better then the early 7970 drivers - lots of microstuttering in Eyefinity. Didn't you have 7970s at first, then switch to 680s, and now back to square 1?


You sure sound like a fanboy. Every brand of card has had problems with its drivers when the cards first came out. Proof? How about those red screens on the 680's? AMD's drivers are just fine.


----------



## dmanstasiu

Quote:


> Originally Posted by *FcZenitFan*
> 
> Hey Vega. So if you've switched back to AMD, they must have gotten their s*t, I mean drivers, fixed? I had originally bought three 7970s (had them until release of GTX680) and they would blue screen half of the games; if I only used 2 of them, then I'd get CRAZY microstutter in most games (although they'd work ok on a single screen), and with a single 7970 I'd have my computer freeze in Skyrim and Serious Sam.
> Since I've switched over to NVIDIA everything's silky smooth running at 5760x1080, all games maxed out (except for MSAA in some cases like BF3) with great framarates. I still can't believe that AMD fixed their drivers up. I've tried using 6990+6970 before (until January of this year) and that was not much better then the early 7970 drivers - lots of microstuttering in Eyefinity. Didn't you have 7970s at first, then switch to 680s, and now back to square 1?


Quote:


> Originally Posted by *nvidiaftw12*
> 
> You sure sound like a fanboy. Every brand of cards have had problems with their drivers when the cards first came out. Proof? How about those redscreens on the 680's? Amd's drivers are just fine.


What Vega, or someone else (my memory is fuzzy), did was change monitors.

FcZenitFan, were you using an active adapter?


----------



## Nope oO

Quote:


> Originally Posted by *l88bastar*
> 
> I just got my *VEGA INSPIRED* 5x1 setup....setup












That's it, I'm going to have to join the Vega fanclub too. I've had enough of tunnel vision single screens. If I can see 180 degrees in real life, why not in a video game too?

Bassplayer edit: removed images from quote.


----------



## andrew grp

Quote:


> Originally Posted by *nvidiaftw12*
> 
> You sure sound like a fanboy. Every brand of cards have had problems with their drivers when the cards first came out. Proof? How about those redscreens on the 680's? Amd's drivers are just fine.


Nah, he's just telling the truth. Battlefield had a lot of problems with ATI's drivers: crashes and "Display driver stopped responding and has recovered". Horrible stuff. If NV hadn't crippled their cards' compute performance I wouldn't have a 7970 (no matter how many problems NV has with their drivers, I can't believe they come close to AMD's inconsistency).
Your painting is incomplete, where are the other pieces?


----------



## FcZenitFan

Quote:


> Originally Posted by *nvidiaftw12*
> 
> You sure sound like a fanboy. Every brand of cards have had problems with their drivers when the cards first came out. Proof? How about those redscreens on the 680's? Amd's drivers are just fine.


From January 2009 until March 2012 I was running AMD cards exclusively. Not a single NVIDIA card in that time frame. It was all fine and dandy when I used single cards (3870, 5870, 6970), but any crossfire solution I've tried from AMD has always given me lots of issues (3870x2, 6970+6990, 7970x3). So just to see if "the grass is greener on the other side", I've given NVIDIA a shot. Well, from my standpoint (3-monitor user), it is indeed greener. Yes, there are some things I don't like about NVIDIA's drivers and setup of Surround, but it all comes down to whether I can actually play my games on more than 1 screen without much trouble. With NVIDIA the answer is yes, with AMD the answer is no. Why that is I dunno, but I've spent 6 months debugging my Eyefinity+Crossfire AMD systems, so I feel like I've given them a fair shot.


----------



## terraprime

Quote:


> Originally Posted by *l88bastar*
> 
> I just got my *VEGA INSPIRED* 5x1 setup....setup


OMG!!! Now that is just a beautiful sight right there.....*drool* Is that with all the bezels removed completely? Makes those look so sexy









And the painting in the background just makes it so much better lol.

Bassplayer edit: removed images from quote.


----------



## CallsignVega

Please people, learn how not to quote pictures. Pages of the same pictures over and over gets a bit annoying.
Quote:


> Originally Posted by *FcZenitFan*
> 
> Since January 2009 until March 2012 I was running AMD cards exclusively. Not a single NVIDIA card in that time frame. It was all fine and dandy when I used single cards (3870, 5870, 6970), but any crossfire solution I've tried from AMD has always given me lots of issues (3870x2, 6970+6990, 7970x3). So just to see if "grass is greener on the other side", I've given NVIDIA a shot. Well, from my standpoint (3-monitor user), it is indeed greener. Yes, there are some things I don't like about NVIDIA's drivers and setup of Surround, but it all comes down to whether I can actually play my games on more than 1 screen without much trouble. With NVIDIA the answer is yet, with AMD the answer is no. Why that is I dunno, but I've spent 6 month debugging my Eyefinity+Crossfire AMD systems, so I feel like I've given them a fair shot.


The 69xx series sucked compared to the 79xx, IMO. NVIDIA does pretty well, but like you said, they are limited to three screens. I just happened to get back into the 79xx series after they released drivers that actually work. The difference between pre-12.6 and post-12.6 is incredible. Being able to play BF3 and GW2 pretty darn smooth over 5 monitors in 4-way CrossFire is nothing short of amazing. I can't wait to see what the 89xx series will be able to do!


----------



## Mals

Quote:


> Originally Posted by *CallsignVega*
> 
> AMD have come a long way. My four Lightning 7970's are running my 5x1 setup swimmingly and I haven't even overclocked them yet (waiting for EK blocks). Until NVIDIA wake up and allow 5x1, I'll be with AMD for quite a while. Not to mention 79xx series does better at high resolution and refresh rates than the 680 anyway. 680 with it's tiny bandwidth bus.


Vega were you running 2GB 680s? I would be intrigued to see the 4GB 680's vs the 3GB 7970s. I think it MUST have been a VRAM issue if it was the 2GB 680s in 4xSLI. That would definitely bottleneck hard. I feel like the GPU power here isn't as big of an issue as VRAM becomes at this resolution, although you certainly need some serious GPU power to handle it.


----------



## xxbassplayerxx

Quote:


> Originally Posted by *CallsignVega*
> 
> Please people, learn how not to quote pictures. Pages of the same pictures over and over gets a bit annoying.
> The 69xx series sucked compared to the 79xx IMO. NVIDIA does pretty well, but like you said they are limited to three screens. I just happen to get back into the 79xx series after they released drivers that actually work. The difference between pre-12.6 and post 12.6 is incredible. The point where I can play BF3 and GW2 pretty darn smooth over 5 monitors in 4-way crossfire is nothing short of amazing. I can't wait to see what the 89xx series will be able to do!


Gotcha covered


----------



## nvidiaftw12

Quote:


> Originally Posted by *FcZenitFan*
> 
> Since January 2009 until March 2012 I was running AMD cards exclusively. Not a single NVIDIA card in that time frame. It was all fine and dandy when I used single cards (3870, 5870, 6970), but any crossfire solution I've tried from AMD has always given me lots of issues (3870x2, 6970+6990, 7970x3). So just to see if "grass is greener on the other side", I've given NVIDIA a shot. Well, from my standpoint (3-monitor user), it is indeed greener. Yes, there are some things I don't like about NVIDIA's drivers and setup of Surround, but it all comes down to whether I can actually play my games on more than 1 screen without much trouble. With NVIDIA the answer is yet, with AMD the answer is no. Why that is I dunno, but I've spent 6 month debugging my Eyefinity+Crossfire AMD systems, so I feel like I've given them a fair shot.


Fair enough.


----------



## Mals

Vega one more question, why didn't you bother trying SLI 690's? 4GB Vram... two cards, less power draw, possibilities? Probably just lack of availability at the time? I'd like to think they could handle this setup.. I'm tempted to hear results of that.


----------



## Wattser93

Quote:


> Originally Posted by *Mals*
> 
> Vega one more question, why didn't you bother trying SLI 690's? 4GB Vram... two cards, less power draw, possibilities? Probably just lack of availability at the time? I'd like to think they could handle this setup.. I'm tempted to hear results of that.


GTX 690s are 2 GTX 680 2GBs in SLI. The 4GB of VRAM comes from each "card" (so to speak) having 2GB of VRAM, but as a whole, you still only have 2GB of VRAM.


----------



## Mals

Quote:


> Originally Posted by *Wattser93*
> 
> GTX 690s are 2 GTX 680 2GBs in SLI. The 4GB of VRAM comes from each "card" (so to speak) having 2GB of VRAM, but as a whole, you still only have 2GB of VRAM.


I am almost 100% positive this is false. If you Quad SLI them (2 690's) you end up with 4GB total of VRAM


----------



## Nope oO

Quote:


> Originally Posted by *Mals*
> 
> I am almost 100% positive this is false. If you Quad SLI them (2 690's) you end up with 4GB total of VRAM


You can only use 2GB per GPU. It's no different than 4 separate 680s with 2GB each.


----------



## hakkafusion

Quote:


> Originally Posted by *Mals*
> 
> I am almost 100% positive this is false. If you Quad SLI them (2 690's) you end up with 4GB total of VRAM


Quote:


> Originally Posted by *Nope oO*
> 
> You can only use 2GB per GPU. It's no different than 4 separate 680s with 2GB each.


I can also 100% second Nope oO. 2x2GB VRAM != 4GB. You're still limited to 2GB per GPU, not 4GB, which doesn't work well for Surround!


----------



## Samurai Batgirl

Quote:


> Originally Posted by *Mals*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wattser93*
> 
> GTX 690s are 2 GTX 680 2GBs in SLI. The 4GB of VRAM comes from each "card" (so to speak) having 2GB of VRAM, but as a whole, you still only have 2GB of VRAM.
> 
> 
> 
> I am almost 100% positive this is false. If you Quad SLI them (2 690's) you end up with 4GB total of VRAM
Click to expand...

Total VRAM, yes. Usable VRAM, no. The GTX 690's have 4GB of VRAM because two × 2GB from the 680's is 4GB. However, for SLI and Crossfire you have to copy all of the information to both sets of VRAM so the usable amount is only equal to what's on/for one GPU.

Also, he didn't use the GTX 690's because Nvidia can only have 3+1 monitors for surround. AMD cards can use up to six.
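If it helps to see the mirroring as arithmetic, here's a toy sketch (plain Python, purely illustrative; the function name is made up, not any vendor API):

```python
# Toy model: in SLI/CrossFire alternate-frame rendering, every GPU keeps its
# own full copy of textures and buffers, so the usable pool equals one GPU's
# VRAM no matter how many GPUs you stack.
def usable_vram_gb(per_gpu_vram_gb, gpu_count):
    assert gpu_count >= 1
    return per_gpu_vram_gb  # copies are mirrored, not pooled

# A GTX 690 (2 GPUs x 2 GB) or quad-SLI 680s: still 2 GB usable.
print(usable_vram_gb(2, 2))  # 2
print(usable_vram_gb(2, 4))  # 2
```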


----------



## Jackeduphard

For a 3-LCD setup (possibly upgrading to a 5-screen setup in a year or two), would a GTX 670 or a 7970 in XF be the way to go?


----------



## Samurai Batgirl

If you _know_ you're going to have five monitors, then the 7970, as it's the only one of the two that can even support that many monitors.








I'd still go with the 7970, though. More VRAM if those monitors are high res, and the 7970 can overclock like a beast (to the point of beating out a 680).


----------



## CallsignVega

Sadly, not sure if NVIDIA will ever stray from the 3-screen Surround limit. 5x1 is only going to get better if they come out with 1080P OLED screens with 1-2mm bezels.


----------



## Mals

Quote:


> Originally Posted by *CallsignVega*
> 
> Sadly, not sure if NVIDIA will ever stray from the 3-screen Surround limit. 5x1 is only going to get better if they come out with 1080P OLED screens with 1-2mm bezels.


1080P OLED 120 hz with 1mm bezels...


----------



## b0z0

How am I going to talk my wife into this type of setup lol.


----------



## Quest99

Quote:


> Originally Posted by *b0z0*
> 
> How am I going to talk my wife into this type of setup lol.


alcohol....lots of it.


----------



## Mals

Quote:


> Originally Posted by *b0z0*
> 
> How am I going to talk my wife into this type of setup lol.


You're going to tell her it cost you about $500.


----------



## b0z0

Quote:


> Originally Posted by *Mals*
> 
> you're going to tell her it cost you about $500


And a first born? Lol.


----------



## De-Zant

Quote:


> Originally Posted by *Mals*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> Sadly, not sure if NVIDIA will ever stray from the 3-screen Surround limit. 5x1 is only going to get better if they come out with 1080P OLED screens with 1-2mm bezels.
> 
> 
> 
> 1080P OLED 120 hz with 1mm bezels...
Click to expand...

You're going to be waiting for that one for a good while. Even more if you plan on getting it for an affordable price.


----------



## Quest99

ohhh Vega.....what do you think about this?

ASUS Readies 27-inch 3D Monitor with 144 Hz Panel

ASUS is working on a new 27-inch desktop monitor that is capable of 144 Hz refresh-rate, for smoother stereoscopic 3D display. The VG278HE from ASUS is based on an older panel technology, but one that is perfected for the high refresh-rates.

http://vr-zone.com/articles/asus-readies-27-inch-3d-monitor-with-144-hz-panel/16836.html


----------



## CallsignVega

Quote:


> Originally Posted by *De-Zant*
> 
> You're going to be waiting for that one for a good while. Even more if you plan on getting it for an affordable price.


I hear ya, they sure are taking their sweet time with OLED.

Quote:


> Originally Posted by *Quest99*
> 
> ohhh Vega.....what do you think about this?
> ASUS Readies 27-inch 3D Monitor with 144 Hz Panel
> ASUS is working on a new 27-inch desktop monitor that is capable of 144 Hz refresh-rate, for smoother stereoscopic 3D display. The VG278HE from ASUS is based on an older panel technology, but one that is perfected for the high refresh-rates.
> http://vr-zone.com/articles/asus-readies-27-inch-3d-monitor-with-144-hz-panel/16836.html


It would be interesting to see how small the interior bezels are on that display. Although it is 144 Hz over DVI only, so you'd need a ton of adapters to run via DisplayPort in large Eyefinity setups. Also, I think it might be matte anti-glare, which = a no-go for me.


----------



## Mals

Quote:


> Originally Posted by *De-Zant*
> 
> You're going to be waiting for that one for a good while. Even more if you plan on getting it for an affordable price.


Sadly yes. I have just been begging for a 1080p IPS 120Hz... and I just don't think it'll happen under $500 anytime in the next 4 years.


----------



## De-Zant

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *De-Zant*
> 
> You're going to be waiting for that one for a good while. Even more if you plan on getting it for an affordable price.
> 
> 
> 
> I hear ya, they sure are taking their sweet time with OLED.
Click to expand...

The nearest thing we have to consumer affordable OLED for PC is the LG 15EL9500. If manufacturers just took that 5-7 inches further with a slightly higher resolution, we'd be getting somewhere. Yet they don't. And that's frustrating. (Of course it's a bit more complicated than that but still)


----------



## l88bastar

Quote:


> Originally Posted by *b0z0*
> 
> How am I going to talk my wife into this type of setup lol.


One monitor at a time...like a frog in a slowly boiling pot of water


----------



## feteru

Wow, I was at Microcenter today and saw a ZR30W, and it's HUGE. I can't imagine what these setups are like!


----------



## andrew grp

It's sad that they don't make monitors like some TVs.

Guys, do you remember these? And that's without the Vega treatment.


----------



## CallsignVega

Ya, it's hard to find screens that tick all of the boxes: small bezels, 120 Hz, thin, VESA mount, lightweight, good viewing angles, low pixel response times. TVs like those above usually don't do well on many of those fronts.


----------



## l88bastar

Quote:


> Originally Posted by *andrew grp*
> 
> Nahh, he just says the truth, Battlefield had a lot of problems with the Ati's drivers with crashes and "Display driver stopped responding and has recovered". Horrible stuff, if Nv hadn't crippled the cards when it comes to compute performance I wouldn't have 7970 (no matter how many problems Nv has with their drivers I can't believe they can come close to Amd's inconsistency).
> Your painting is incomplete, where are the other pieces?


I don't need the rest of the painting...haven't you ever heard the term "Less is More"? Now if you will excuse me, I am off to play on my MASSIVE 70" 5x1 setup lmao.
Quote:


> Originally Posted by *andrew grp*
> 
> It's sad that they don't make monitors like some Tv's.
> Guys, do you remember these? And that's without the Vega's treatment.


Lol, he is giving up EVERYTHING just to have those small bezels. That monitor is only 720p, 60Hz, and triple 46" landscape is WAAAAY TOO MASSIVE. Look at how much display area goes to waste on his setup versus a proper 5x1. And no, my bezels on the right aren't overly thick, it's just the angle of the monitors.


----------



## andrew grp

Yeah, this time you have outdone even yourself. But don't forget, those screens weren't supposed to be used as monitors, so you can't say they are bad. Is that a Deck keyboard? Try to convince Vega to get a mechanical, or does he want to have a super small keyboard so his monitors look bigger? (Smart guy, Vega.)

Also, well played there hahaha


Spoiler: Warning: Spoiler!



I don't need the rest of the painting...haven't you ever heard the term "Less is More?" Now if you will excuse me I am off to play on my MASSIVE 70" 5x1 setup lmao.


----------



## CallsignVega

Quote:


> Originally Posted by *andrew grp*
> 
> Try to convince Vega to get a mechanical or does he want to have a super small keyboard so his monitors to look bigger? (smart guy Vega)


LOL. If I wanted larger monitors and associated larger pixels I would have gone with larger monitors. I use a small keyboard in order to keep the G13 game-pad and my trackball about shoulder width apart. It's about ergonomics.


----------



## l88bastar

Quote:


> Originally Posted by *andrew grp*
> 
> Yeah, this time you have outdone even yourself but don't forget, those screens weren't suppossed to use them as monitors so you can't say they are bad, is that a deck keyboard? Try to convince Vega to get a mechanical or does he want to have a super small keyboard so his monitors to look bigger? (smart guy Vega)
> Also, well played there hahaha
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I don't need the rest of the painting...haven't you ever heard the term "Less is More?" Now if you will excuse me I am off to play on my MASSIVE 70" 5x1 setup lmao.


Yea, I do a lot of typing and the DECK is awesome...but not so good for gaming IMHO. But I don't care, because I use a gamepad for my movements and a mouse for looking. Vega's setup is like a streamlined BF3 killing workstation, whereas mine is more like the Cadillac of Killing....I gotta go in style even if it means sacrificing a bit of K/D.


----------



## Samurai Batgirl

Is it bad that I'm already excited for next year's computer from Vega?









Nothing wrong with this setup, by the way. It's awesome.


----------



## zdude

I would love to see what vega could do with IVY-E...


----------



## 161029

Quote:


> Originally Posted by *zdude*
> 
> I would love to see what vega could do with IVY-E...


I'd love to see what he can do with Haswell (if it has DDR4) and...would it be Maxwell by then?


----------



## Levesque

Quote:


> Originally Posted by *CallsignVega*
> 
> I use a small keyboard in order to keep the G13 game-pad and my trackball about shoulder width apart. It's about ergonomics.


Never heard about Tenkeyless mechanical keyboards?

I'm using two Leopold Tenkeyless (Blues and Browns) with my G13. Perfectly aligned between both hands. Blues for typing and browns for gaming.


----------



## nathanak21

Seriously Vega. With a system like that, how could you not get a mechanical keyboard? Just got my mx8100 a couple weeks ago and it's AMAZING.


----------



## iCrap

Vega uses the Apple keyboard, right? It's not actually that bad...


----------



## Mals

Get one of the tenkeyless boards from EliteKB... those things are sweet =P I have the Rosewill blacks... I'd probably suggest browns or clears if you can get 'em, but those are supposed to be great.


----------



## nathanak21

Quote:


> Originally Posted by *iCrap*
> 
> Vega uses the apple keyboard right? Its not actually that bad...


Yea, probably my favorite non-mech. I got my POS mx brown 8100 for $25 shipped.


----------



## CallsignVega

I guess I've gotten used to chiclet-type keyboards. I like the very short throw. Although for gaming I use the G13, which I wish had a bit more positive key action. Maybe someone has made a mechanical mod for the G13? That would be nice. lol


----------



## Snipe3000

If you prefer a custom feel, I recommend the DX1, I've been using it for years.


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> I guess I've gotten used to chicklet type keyboards. I like the very short throw. Although for gaming I use the G13 which I wish had a bit more positive key action. Maybe someone has made a mechanical mod for the G13? That would be nice. lol


I think I saw on Geekhack that someone had swapped out all of the rubber domes in a G13 for mechanical keyswitches. They said it was really difficult though.


----------



## Maniak

Yowza. I just checked out some of your YouTube vids, Vega. Mother of God. I totally forgot about this thread when I first caught sight of it a few months ago. I'm glad I saw it again.

On another note, my PC is now watercooled by all the drool I've been spouting non-stop after seeing your setup.

Great stuff man.


----------



## rubixcube101

Question... do you have to have a stand to hold these Samsung bad boys up? Or is there another way of sitting them on the desk or something? (Not sure how they mount, because I don't have one







)


----------



## Brokenstorm

Quote:


> Originally Posted by *rubixcube101*
> 
> question... Do you have to have a stand to hold these samsung bad boys up? or is there another way of sitting them on the desk or something? (not sure how they hold cause i dont have one
> 
> 
> 
> 
> 
> 
> 
> )


http://www.wsgf.org/products/wsgf-edition-stand


----------



## coolhandluke41

lmao


----------



## wongwarren

Quote:


> Originally Posted by *coolhandluke41*
> 
> lmao


HAHAHAHAHAHA


----------



## l88bastar

I watercool everything else, so naturally my dogs were next


----------



## dmanstasiu

Quote:


> Originally Posted by *coolhandluke41*
> 
> lmao


Lesson: Don't feed your dog liquid coolant


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> I watercool everything else, so naturally my dogs were next


Ya, I saw his dogs do over a hundred laps around his condo before getting tired. They were super cooled! Plus the dogs are very useful when the power goes out and you can't find a flashlight.


----------



## ganganputput

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> If you _know_ you're going to have five monitors, then the 7970 as it's the only one of the two that can even support than many monitors.
> 
> 
> 
> 
> 
> 
> 
> 
> I'd still go with the 7970, though. More VRAM if those monitors are high res, and the 7970 can overclock like a beast (to the point of beating out a 680).


Hmm, l88bastar and Vega, I remember both of you are running the 7970 Lightnings in quadfire and haven't watercooled the babies yet.
I wonder if you have some pics of these cards in action? I was reading up on reviews, and the GPU Reactor on the back of the cards and the fan blades (which seem to protrude out of the fan housing) look wider than 2 slots.
I'm wondering how you guys made them fit. Do the protruding fan blades and GPU Reactor cause issues in 4-way quadfire, in terms of space clearance, fan blades not hitting the GPU Reactor of the next card, etc.?


----------



## CallsignVega

On GPU's two through four you have to remove the reactor until you water cool.


----------



## ganganputput

Quote:


> Originally Posted by *CallsignVega*
> 
> On GPU's two through four you have to remove the reactor until you water cool.


Nice! Thanks Vega for the clarifications!

Hmm, I just realised something:

Your name, Vega: it's actually EVGA if you rearrange the letters. Hehe, secret operative of EVGA:















just a passing thought







hehe


----------



## CallsignVega

Hah. Actually the only EVGA part I will have as soon as I can get it is the NEX 1500w power supply. That is pretty sweet. I've never been too impressed with EVGA's motherboards and until NVIDIA can do 5x1 I won't be getting any EVGA GPU's anytime soon to boot.


----------



## pahoran

hey vega when did you switch to the ASRock Extreme 11 x79?! any performance increase? pics?

P0w


----------



## CallsignVega

It's arriving in the mail today with six more Vertex 4's to run 8x V4's in RAID 0. Cannot let that nice LSI RAID chip go unused now can I?









Will post some pic's and info over the next few days.

IB-E doesn't look likely till Q3 2013 now, so the Extreme11 should be top dog for quite a while.

http://vr-zone.com/articles/ivy-bridge-e-hedt-arrives-in-2013/16886.html#ixzz22J4SVLFG


----------



## Mals

8x Vertex 4's in RAID... that isn't 7x redundancy, is it? It's like 4 drives' worth of space and 4 for redundancy... right...? (Sorry, I never got into RAID)


----------



## Onions

Nope.

Quick breakdown:

RAID 0: striped across drives for speed
RAID 1: mirrored for redundancy
RAID 0+1: needs 4 drives; both speed and redundancy

----------



## CallsignVega

Haha ya I'll have eight drives in RAID 0. So zero redundancy. If one drive goes, all is lost. I love living on the edge!


----------



## nvidiaftw12




----------



## Mals

iiiiiiinteresting. Cool thanks for the breakdown. That actually makes a lot of sense after all this raid talk I see









Soo, "striped for speed"?









So that requires 2 drives? Is it sorta like dual channel ram where you need two identical drives? I ask because I am tempted to buy another SSD sometime in the near future (Steam takes up too much rooooooom on a 128) and I am not sure how I want to go about doing it. My honest opinion is that I might just get a 250~GB drive for steam, keep my 500GB HD for storage, and my 120GB for windows and that would be all I ever need.

@Vega, you wild animal you.


----------



## Mals

"Any drive failure destroys the array, and the likelihood of failure increases with more drives in the array (at a minimum, catastrophic data loss is almost twice as likely compared to single drives without RAID). A single drive failure destroys the entire array because when data is written to a RAID 0 volume, the data is broken into fragments called blocks."

"More drives in the array means higher bandwidth, but greater risk of data loss."

Jesus VEGA you are LOCO!

Vega's obituary will read "he liked to play russian roulette with five bullets in a six shooter, and he only lost once."
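The "almost twice as likely" bit is just basic probability. A quick sketch, assuming a made-up 2% annual per-drive failure rate and independent failures:

```python
# RAID 0 fails if ANY member drive fails, so with per-drive failure
# probability p and n independent drives:
def raid0_failure_prob(p, drives):
    return 1 - (1 - p) ** drives

p = 0.02  # assumed 2% annual failure rate per drive, purely illustrative
print(round(raid0_failure_prob(p, 1), 4))  # 0.02
print(round(raid0_failure_prob(p, 2), 4))  # 0.0396 -> "almost twice as likely"
print(round(raid0_failure_prob(p, 8), 4))  # 0.1492 -> an 8-drive array like Vega's
```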


----------



## Onions

That is correct. For RAID you should always use identical drives. I'm unsure if you strictly have to, but they at least have to be the same size. Your best bet is to just get a 256GB SSD, add it in, and use it for Steam games.


----------



## reflex99

You *can* RAID asymmetrical drives, but the smaller one will obviously limit the usable space on the larger drive.

Usually, for best performance and reliability, most people recommend using identical drives.
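As a toy illustration of that capacity limit (assuming a plain RAID 0 stripe; `raid0_usable_gb` is invented for the example):

```python
# In a striped array, the smallest member caps the usable slice of every
# drive, so usable capacity = smallest drive x number of drives.
def raid0_usable_gb(drive_sizes_gb):
    return min(drive_sizes_gb) * len(drive_sizes_gb)

print(raid0_usable_gb([128, 128]))  # 256 -> matched pair, nothing wasted
print(raid0_usable_gb([128, 256]))  # 256 -> half of the 256 GB drive sits idle
```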


----------



## CallsignVega

Ya, I went with eight 128GB Vertex 4's. Wonder if I will be able to max out the LSI chip?


----------



## Mals

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, I went with eight 128GB Vertex 4's. Wonder if I will be able to max out the LSI chip?


You didn't consider just going 4x256GB for less chance of frying the whole RAID?


----------



## stren

Quote:


> Originally Posted by *Mals*
> 
> You didnt' consider just going 4x256 for less chance of frying the whole raid?


SSDs are pretty reliable these days, and it's not like it's critical stuff. Worst case you reinstall the OS and games right? I mean it's not like Vega bought SSDs from a brand with a poor reliability record right?


----------



## Mals

Quote:


> Originally Posted by *stren*
> 
> SSDs are pretty reliable these days, and it's not like it's critical stuff. Worst case you reinstall the OS and games right? I mean it's not like Vega bought SSDs from a brand with a poor reliability record right?


Muahahaha... MUAHAHAHAHAHA....

Seriously though, I have an OCZ Solid 3 and have had zeeeeeero issues with it. OCZ had some of the early SSDs that certainly had issues, but it seems like QC has cleared up a lot of that; not really any reliability complaints lately. Same story with MSI: people were saying their mobos were blowing up left and right, and I'm pretty sure their reputation (at least on the Intel side) is pretty rock solid now.


----------



## Mhill2029

Quote:


> Originally Posted by *CallsignVega*
> 
> It's arriving in the mail today with six more Vertex 4's to run 8x V4's in RAID 0. Cannot let that nice LSI RAID chip go unused now can I?
> 
> 
> 
> 
> 
> 
> 
> 
> Will post some pic's and info over the next few days.
> IB-E doesn't look like till Q3 2013 now so the Extreme11 should be top dog for quite a while.
> http://vr-zone.com/articles/ivy-bridge-e-hedt-arrives-in-2013/16886.html#ixzz22J4SVLFG


So it's possible to use a dedicated controller with 4x GPU's? Or is this only possible with GPU waterblocks?

Btw: I thought you'd be broke by now







But you do keep on spending like a crazy person.....


----------



## l88bastar

Quote:


> Originally Posted by *Mhill2029*
> 
> So it's possible to use a dedicated controller with 4x GPU's? Or is this only possible with GPU waterblocks?
> Btw: I thought you'd be broke by now
> 
> 
> 
> 
> 
> 
> 
> But you do keep on spending like a crazy person.....


I know right! Since Vega's started this thread it has personally cost me thousands of dollars in upgrades. I now have to work a night job just to pay for the watercooling on my quadfire 7970 lightnings.


----------



## CallsignVega

Sexy!

On another note, I've never had an SSD fail on me, so I think I will be OK. I got all eight V4's up to firmware 1.5 using the Sniper 3. Of course, I ran into a huge snag.

Keep getting a d6 code on boot: "No console output devices found". That can only mean it is not recognizing the video card. I've tried all four 7970 Lightnings in all four 16x slots and no go. Lucky me, I don't see an easy way to fix this.


----------



## Mals

Quote:


> Originally Posted by *CallsignVega*
> 
> Sexy!
> On another note, haven't had an ssd fail on me ever. I think I will be OK. I got all eight V4's up to firmware 1.5 using the Sniper 3. Of course run into a huge snag.
> Keep getting d6 code on boot. "No console output devices found". Can only mean it is not recognizing the video card. I've tried all four 7970 Lightning's in all four 16x slots and no-go. Lucky me, I don't see an easy way to fix this.


That is certainly bizarre; possibly the motherboard just can't handle all that hardware? Good thing you have that ASRock Extreme 11.


----------



## Samurai Batgirl

VEGA. VEGA. VEGA.

http://www.overclock.net/t/1286519/nixeus-vue-27-s-ips-2560x1440-led-monitor-nx-vue27-usa-brand/0_30

DISPLAY PORT. SIMILAR TO THE CATLEAPS.
Possible PCB switch could mean it could "overclock" like the Korean versions. Maybe OC without PCB switch.
Also, it's using a higher grade panel.
AND!
AMERICA. BLANK YEAH.


----------



## Swolern

Quote:


> Originally Posted by *l88bastar*
> 
> I know right! Since Vega's started this thread it has personally cost me thousands of dollars in upgrades. I now have to work a night job just to pay for the watercooling on my quadfire 7970 lightnings.


Nice legs







I'll throw in a couple bucks for the cause, lol.


----------



## CallsignVega

Well, I swapped out my Team Group 2400 MHz RAM and tested with some old sticks I had lying around. The board now works! If it doesn't like that RAM, why would it give a video card error? Strange.

So I was setting up my LSI 8x RAID 0 and everything went fine, but in the config utility there was nowhere to select the stripe size. Should I assume the LSI just uses a default stripe size?

Oh, BTW: the LSI controller is pretty slow on boot. It takes 20+ seconds just to load up before the BIOS even posts.
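For anyone wondering what the stripe size actually does: in RAID 0 it just decides how a logical byte offset is chopped across the member disks. A minimal sketch, assuming a hypothetical 8-drive array and a 64 KB stripe (a common controller default; the actual default on the LSI 2308 depends on its firmware):

```python
# Sketch: how RAID 0 maps a logical offset to a member disk.
# STRIPE_SIZE and NUM_DISKS are illustrative assumptions, not LSI specifics.

STRIPE_SIZE = 64 * 1024   # bytes written to one disk before moving to the next
NUM_DISKS = 8             # e.g. 8x Vertex 4 in RAID 0

def locate(offset: int) -> tuple[int, int]:
    """Return (disk_index, offset_within_disk) for a logical byte offset."""
    stripe_number = offset // STRIPE_SIZE
    disk = stripe_number % NUM_DISKS
    offset_in_disk = (stripe_number // NUM_DISKS) * STRIPE_SIZE + offset % STRIPE_SIZE
    return disk, offset_in_disk

# A 1 MB sequential read touches stripes on every member, which is why
# sequential throughput scales with the drive count:
touched = {locate(off)[0] for off in range(0, 1024 * 1024, STRIPE_SIZE)}
print(sorted(touched))  # all eight disks
```

Smaller stripes spread small files across more disks; larger stripes keep small I/Os on one disk and reduce seek overhead on spinning media.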


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> Well, I swapped out my Team Group 2400mhz RAM and tested with some old ones I had lying around. The board now works! If it doesn't like that RAM why would it give a video card error? Strange.
> So I was setting up my LSI 8x RAID 0 and everything went fine. But in the config utility there was nowhere to select the stripe size. Should I assume that LSI only uses a default stripe size?
> Oh BTW the LSI controller on boot is pretty slow. Takes 20+ seconds just to load up before the BIOS even posts.


Not sure about the stripe, but both LSI controllers I've used have made the boot annoyingly slow. That's why I don't bother with serious RAID anymore.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> Well, I swapped out my Team Group 2400mhz RAM and tested with some old ones I had lying around. The board now works! If it doesn't like that RAM why would it give a video card error? Strange.
> So I was setting up my LSI 8x RAID 0 and everything went fine. But in the config utility there was nowhere to select the stripe size. Should I assume that LSI only uses a default stripe size?
> Oh BTW the LSI controller on boot is pretty slow. Takes 20+ seconds just to load up before the BIOS even posts.


I am very disappointed in you, Vega. Don't you know the proper troubleshooting protocol is to piss and moan about AMD GPUs and drivers being crap, and then lash out at anybody who gives constructive advice? Who do you think you are? What gives you the right to actually SOLVE the issue?


----------



## GnarlyCharlie

Quote:


> Originally Posted by *l88bastar*
> 
> I now have to work a night job just to pay for the watercooling on my quadfire 7970 lightnings.


I tried that, made enough to buy a serial card for a 386.


----------



## Snipe3000

Was the Extreme11 released?


----------



## armartins

Well, glad the memory change solved the problem... This sounded like there wasn't enough room for all those devices' option ROMs to load in the fixed-size memory region reserved for them. I don't recall the whole problem, but the same thing happened before to people running a RevoDrive 2 with tri/quad GPU setups: as far as I recall there's about 100 KB of space for loading GPU BIOSes, and at the time the RevoDrive 2 needed to load a "fat" BIOS into that mapped memory, which resulted in crashes despite there being enough lanes on the mobo (X58 days...).

Also glad to see my little "contribution" (from your former 120GB Vertex 3) being very well used, lol... 8x Vertex 4... show us the numbers!

On a side note: when I finally make the move to X79, I'm considering going for a stupidly big amount of 1600 MHz RAM and trying to use a RAM drive to "cache" full folders of games like BF3 and WoW. If it works right, this should be seriously fast, and for persistent-world games like WoW (besides interface/addons) and BF3, where everything is on the server side, it should work fine. I haven't started the research yet, but it must be a workaround similar to NTFS junctions between an HDD and an SSD, allowing the same folder to span different drives and eliminating all the hassle...


----------



## CallsignVega

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> VEGA. VEGA. VEGA.
> http://www.overclock.net/t/1286519/nixeus-vue-27-s-ips-2560x1440-led-monitor-nx-vue27-usa-brand/0_30
> DISPLAY PORT. SIMILAR TO THE CATLEAPS.
> Possible PCB switch could mean it could "overclock" like the Korean versions. Maybe OC without PCB switch.
> Also, it's using a higher grade panel.
> AND!
> AMERICA. BLANK YEAH.


Take a look at the Overlord Tempests. Quality panels, 120 Hz, DisplayPort. No need to swap the PCB!








Quote:


> Originally Posted by *armartins*
> 
> Well glad the memory change solved the problem... This was clearly sounding like no "space" for all those device's bios to actually load up on the fixed size partition for then.... I don't recall the whole problem but that happened before with people using revodrive 2 and tri quad GPU... as far as I recall there's a 100kb space for loading gpu bios and at that time the revodrive 2 need to load a "fat" bios to that mapped memory and resulted in crashes despite the avaliability of enough lanes on the mobo... (X58 days...).
> Also Glad to see my little "contribution" (from your former 120gb Vertex 3) being very well used lol... 8 vertex 4... show us the numbers!
> On a side note: When I finally make the move to X79 I'm considering going for stupid big ram amount with 1600Mhz and try to use RAM Drive to "cache" full folders of games like BF3 and WoW... If it works right this should be serious fast... and persistent world games like wow (besides interface/addons) and BF3 where everything is on the server side... it should work fine. I didn't start the research yet but it must be workaround similar to the NTFS Joints between an HD and a SSD allowing the same folder across different drives, eliminating all the hassle ...


Ya, this definitely was a strange issue. Now I am trying to get a RAID issue resolved; so far the LSI controller is not as fast as advertised. As for the RAM drive, maybe someone here has some experience with that. I don't. Let us know how it works out.


----------



## Samurai Batgirl

Oh my gosh Vega...
I love you o.o


----------



## Mals

Wait wait wait. The Overlord Tempest on the 120Hz forums? I can't seem to find any concrete info on it, but I was just scanning. From what I can see it's American-made... how do we know it's 120 Hz? Sorry, I get excited about this stuff.







been waiting for 120hz IPS since I was like in my momma's belly.


----------



## CallsignVega

If anyone is bored and wants to read about my trials and tribulations with the Extreme 11 LSI RAID controller:

http://www.overclock.net/t/1289494/lsi-raid-controller-help-2308-chip-on-board-asrock-extreme-11-motherboard-is-slow


----------



## Qezza

Vega, it looks like you have fixed the issues with those numbers? Did you purchase the MegaRAID FastPath software to get it to run as it should, or was it a configuration oversight?


----------



## CallsignVega

Quote:


> Originally Posted by *Qezza*
> 
> Vega it looks like you have fixed the issues with those numbers? Did you purchase the MegaRaid Fastpath software to get it to run as it should or was it a configuration oversight ?


No, those numbers above are just me screwing around with FancyCache. As noted in my other thread on the LSI controller speed issues, there's no resolution to the lackluster speed.


----------



## Brokenstorm

Quote:


> Originally Posted by *CallsignVega*
> 
> As for the RAM drive maybe someone here has some experience with that. I don't. Let us know how it works out.


RAM disks are great. I currently run a 20GB one; I just copy the game I want to play to it (all my games are backed up on a 2TB drive) and enjoy insanely fast loads.


Those are the results I get from 1600 MHz RAM on an X58 @ 3.6 GHz, but I have seen some people get double those values on X79.

EDIT: here are two RAM disk programs I have used.

QSoft RAMDisk: the fastest RAM disk software I'm aware of, with no limit on the size of the RAM disk you can create. It is also free, although it does expire every 3 months, requiring you to reinstall it. Very user-unfriendly and with very few features. This is the one I use.

Primo Ramdisk: very fast software as well, but costs $49.95 for the Professional version, which has a maximum RAM disk size of 32GB. Offers a lot of features such as real-time imaging and loading an image on boot. Very user-friendly.


----------



## CallsignVega

Quote:


> Originally Posted by *Brokenstorm*
> 
> Ram disks are great, i curently run a 20GB one and i just copy the game I want to play to it (all my game are backed up on a 2TB drive) and i enjoy insanely fast loads.
> Those are the result i get from 1600MHz Ram on an X58 @3.6GHz, but i have seen some people get double those value on X79.


And if the computer crashes, you just lose the game data since the last time the game was loaded into RAM?


----------



## Brokenstorm

Quote:


> Originally Posted by *CallsignVega*
> 
> And if the computer crashes, you just lose the game data since the last time the game was loaded into RAM?


Most games keep their saves in My Documents, so it's not really an issue, but if there is a patch or an update you need to remind yourself to back the game up again.
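That "back it up again after a patch" chore is easy to script. A minimal sketch (the folder layout and paths are hypothetical) that mirrors only new or changed files from the RAM disk copy back to the backup drive:

```python
# Mirror files that are new or changed from src to dst (e.g. from the
# RAM disk copy of a game back to its backup folder after a patch).
import filecmp
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> int:
    """Copy new/changed files from src into dst; return how many were copied."""
    copied = 0
    for f in src.rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src)
            # shallow compare (size + mtime) is enough to spot patched files
            if not target.exists() or not filecmp.cmp(f, target, shallow=True):
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, target)  # copy2 preserves timestamps
                copied += 1
    return copied
```

Run it after patching, e.g. `mirror(Path("R:/Skyrim"), Path("D:/Backups/Skyrim"))` with your own drive letters; a second run with nothing changed copies zero files.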


----------



## 10nisman

Quote:


> Originally Posted by *CallsignVega*
> 
> Take a look at the Overlord Tempest's. A quality panels, 120 Hz, display port. No need to switch PCB!


Unfortunately, it seems that the 120 Hz model is DVI-only.









http://www.overlordcomputer.com/products.html


----------



## Mals

Quote:


> Originally Posted by *10nisman*
> 
> Unfortunately it seems that the 120hz model is DVI only
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overlordcomputer.com/products.html


Yeah, the DisplayPort is only available on the non-OC model; I just spoke with them.

However, they apparently have quite a robust DVI circuit or whatever on the PCB, which is what allows them to OC so well, so we'll have to deal with DVI-D.

Sadly, these monitors will be limited to 100 Hz, as the pixel clock is capped around 400 MHz on SLI setups at 1440p. I am a little annoyed that I can't run 120 Hz if I decide to upgrade from a 1080p 120 Hz monitor.

The whole point of going SLI is to achieve 120 Hz/FPS constantly, and by making this next upgrade to 1440p I am knocked back down to 100 Hz (honestly, 100 frames is still good, much better than 60).

Oh well.

VEGA, you also have a problem: not only are AMD cards having some issues (seemingly fixable by changing a hex value) with "overclocking" these monitors, but you are going to be limited to 100 Hz as well if you choose to run them 5x1.


----------



## 10nisman

The 400 MHz SLI cap will surely be fixed by the next generation of graphics cards (GTX 7xx), right?

EDIT: The 5x1 situation could work on a tri/quadfire setup, right?
EDIT 2: I see that Vega's issue is that the MSI Lightning 7970s only have single-link DVI ports. However, the ASUS 7970 can be configured to have 2 dual-link DVI ports, so a minimum of tri-fire would be necessary, but a 5x1 120 Hz Eyefinity setup would be possible.


----------



## l88bastar

The ASUS 7970 has two single-link DVIs, and only one of those can be made dual-link if you turn one of the DisplayPort connections off. I used to own one.


----------



## stren

Quote:


> Originally Posted by *10nisman*
> 
> The 400mhz SLI cap will surely be fixed by the next generation of graphics cards (GTX7xx) right?
> EDIT: The 5x1 situation could work on a tri/quadfire setup, right?
> EDIT 2: I see that Vega's issue is that the MSI Lightning 7970s only have single link DVI ports, however the ASUS 7970 could be configured to have 2 dual link dvi ports, thus a minimum of tri-fire would be necessary, but it would be possible to have a 5x1 120hz eyefinity setup


With Eyefinity, all monitors have to be connected to the first card, so having 2 dual-link DVIs per card doesn't mean you can use 6 monitors across 3 cards. For 5x1 you need 5 ports on the first card capable of 120 Hz output, and no 7970 cards out there do this yet. The MSI Lightning, Vega, technically only has 4 outputs capable of 120 Hz; the 5th should only run at 60 Hz but has been overclocked to run faster. BTW, please correct me if I'm wrong here.


----------



## CallsignVega

Can you say 3815 MB/sec Seq Read.
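For scale, that figure sanity-checks nicely against the drive count. A quick back-of-the-envelope sketch, assuming a ~550 MB/s spec-sheet sequential read per Vertex 4 (ballpark rating, not a measured value):

```python
# Per-drive share of the array's sequential read, and scaling efficiency
# versus an assumed ~550 MB/s per-drive spec.

ARRAY_READ_MBS = 3815   # reported array sequential read
DRIVES = 8              # 8x Vertex 4 in RAID 0
RATED_PER_DRIVE = 550   # assumed spec-sheet figure

per_drive = ARRAY_READ_MBS / DRIVES
efficiency = per_drive / RATED_PER_DRIVE

print(f"per drive: {per_drive:.0f} MB/s, scaling efficiency: {efficiency:.0%}")
```

Roughly 477 MB/s per drive, i.e. the array is keeping about 87% of the per-drive rated speed across eight members, which is respectable scaling for an 8-way stripe.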


----------



## Brokenstorm

Quote:


> Originally Posted by *CallsignVega*
> 
> Can you say 3815 MB/sec Seq Read.


Quite impressive results.









But seriously, you must try to set up a RAM disk; with your setup you would get incredible numbers.
If you're really worried about your computer crashing, both programs I mentioned can be set up to back up their contents to a drive every X minutes.

Also, it turns out the results I gave were from before I overclocked my CPU; here are the new ones. (ATTO overflows when benching a RAM disk.)


----------



## Swolern

Vega how did you get your LSI controller working properly with your 8x SSDs in RAID0?

@ Brokenstorm
Those are great speeds, but I can't do anything meaningful with 64GB.


----------



## CallsignVega

Quote:


> Originally Posted by *Brokenstorm*
> 
> Quite impressive results.
> 
> 
> 
> 
> 
> 
> 
> 
> But seriously, you must try to set up a Ramdisk, with your setup you would get incredible numbers.
> If you're really worried about your computer crashing, both program I mentioned can be set up to back up their content to a drive every X minutes.
> also it turn out the results I gave were from before I overclocked my CPU here are the new ones. (atto overflows when benching a Ramdisk
> 
> 
> 
> 
> 
> 
> 
> )


Ya, I may have to look into it. Have you ever used FancyCache before? Looks interesting.
Quote:


> Originally Posted by *Swolern*
> 
> Vega how did you get your LSI controller working properly with your 8x SSDs in RAID0?
> @ Brokenstorm
> Those are great speeds, but i cant do anything meaningful with 64gb.


I wouldn't say working "properly"; I am just running it differently. Instead of hardware bootable RAID, which doesn't perform very well, I re-flashed the LSI chip to IT mode, which works well in JBOD mode with Win 7 software RAID. It's just that I cannot boot/load the OS from that volume anymore, so it will just be for apps/games. I will need to get another two SSDs for the OS, RAIDed on the two Intel 6Gb/s ports.


----------



## utnorris

So Vega, did you drop the Gigabyte Sniper 3 for this board? If so, why?


----------



## Brokenstorm

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, I may have to look into it. Hve you ever used FancyCache before? Looks interesting.


I have not, but I'm aware of it. As far as I understand, it will perform similarly to a RAM disk under ideal conditions; I'm just not sure how easy those are to reach, or how big a cache you would need for the data (from a game, for example) to remain in cache until it's needed again. Of course, if you enable deferred writing you open the door to data corruption or data loss should the computer crash.

I might have to give it a try, but so far it doesn't seem it would give me anything I don't already get from a RAM disk (besides saving the hassle of having to copy my games over).


----------



## CallsignVega

Quote:


> Originally Posted by *Brokenstorm*
> 
> I have not but I'm aware of it. As far I as understand it will perform similarly to a ramdisk under ideal conditions, I'm just not sure how easy those are to reach or how big a cache you would need for the data (from a game for example) to remain in cache until it's needed again. Of course if you enable deferred writing you open the door to data corruption or data loss should the computer crash.
> I might have to give it a try but so far it doesn't seem it would give me anything I don't already get from a ramdisk. (beside the hassle of having to copy my games over)


I was reading up on RAM disks and found some skeptical answers on whether they're really good for games. Some say the difference between SSDs and RAM disks in games isn't that much, because the OS loads games' important files into RAM anyway. Thoughts? Of course, then there are complaints about the time it takes to load the game in and out of the RAM disk, which I understand sometimes isn't insignificant.
Quote:


> Originally Posted by *utnorris*
> 
> So Vega, did you drop the Gigabyte Sniper 3 for this board? If so, why?


I still have the Sniper 3; I'm just playing around with the Extreme 11 to see how it compares. Got side-tracked by trying to build an uber SSD RAID, lol.


----------



## utnorris

It's a shame the LSI onboard controller doesn't support RAID 5 or 6. Maybe an update will allow it.


----------



## CallsignVega

I don't think so. All the LSI 2308-based cards support only RAID 0, 1, and 10.


----------



## Samurai Batgirl

I'm thinking you haven't, but...
Have you fixed your RAID problem, yet?


----------



## Brokenstorm

Quote:


> Originally Posted by *CallsignVega*
> 
> I was reading up on RAMDISKS and I get some skeptical answers if its really good for games. Some say the difference between SSD's and RAMDISKS in games isn;t that much because games have the OS load their important files into RAM anyway. Thoughts? Of course then there are complaints of the time it takes to load the game in and out of the RAMDISk which I understand sometimes isn't insignificant.


The main purpose of RAM disks when it comes to games is to speed up the loading of new assets and, to a lesser extent, to avoid FPS drops due to fetching data (although SSDs do this quite well too). The only time you're likely to notice the difference is when you load a save or a new area or launch the game itself (in other words, content that isn't cached). I know that it made Skyrim much more enjoyable for me while I was messing around and loading every 5 minutes. Whether you will notice the difference over your SSD RAID 0 I can't say, but over my RevoDrive it felt like night and day as far as load speeds were concerned.

As for the time it takes to copy the game to the RAM disk, it takes me 1min30s to copy 10GB from a 7200rpm drive, so it's not as bad as some people make it sound (of course, this drive is otherwise idle; if people are copying from their system drive, or while watching a movie from the same drive, it will take considerably longer).
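That 1min30s figure is just throughput arithmetic, and the same one-liner estimates the ramdisk load time from any source drive. A sketch (the drive speeds are assumed ballpark figures):

```python
# Estimate how long copying a game folder to a RAM disk takes,
# given its size and the source drive's sustained read speed.

def copy_seconds(size_gb: float, throughput_mb_s: float) -> float:
    return size_gb * 1024 / throughput_mb_s

# ~10 GB from a 7200 rpm drive sustaining ~115 MB/s:
print(f"{copy_seconds(10, 115):.0f} s")   # about a minute and a half
# the same 10 GB from an SSD sustaining ~500 MB/s:
print(f"{copy_seconds(10, 500):.0f} s")
```

So the copy-in cost scales directly with where the backup lives; from a fast SSD array the "hassle" shrinks to seconds.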


----------



## CallsignVega

I set up a 10GB RAM drive and copied Skyrim to it (I think it's around 7GB) from my 8x Vertex 4 mega-LSI RAID and it only took about 5 seconds lol.









So, mega-RAID setup plus RAM disk: the most overkill gaming storage setup known to man, haha. Thinking about trading my 16GB of RAM up to 32 or 64GB. I understand Win 7 Home can only handle 16GB. Now, is that excluding the RAM disk? I.e., can the RAM disk access all of the RAM that Win 7 can't?

I guess I could always upgrade Win 7, but that's wasted money. This build is on a very strict budget.


----------



## TA4K

Quote:


> Originally Posted by *CallsignVega*
> 
> I set up a 10GB RAM drive and copied Skyrim to it (I think it's around 7GB) from my 8x Vertex 4 mega-LSI RAID and it only took about 5 seconds lol.
> 
> 
> 
> 
> 
> 
> 
> 
> So Mega-RAID setup and RAMDISK, the most overkill gaming storage setup known to man haha. Thinking about trading up my 16 GB RAM to 32 or 64 GB. I understand Win 7 home can only handle 16GB. Now is that excluding the RAMDISK? IE: Can the RAMDISK access all of the RAM that Win 7 can't?
> I guess I could always upgrade Win 7 but that's wasted money. *This build is on a very strict budget*.


sounds legit.


----------



## Samurai Batgirl

Quote:


> Originally Posted by *CallsignVega*
> 
> I set up a 10GB RAM drive and copied Skyrim to it (I think it's around 7GB) from my 8x Vertex 4 mega-LSI RAID and it only took about 5 seconds lol.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So Mega-RAID setup and RAMDISK, the most overkill gaming storage setup known to man haha. Thinking about trading up my 16 GB RAM to 32 or 64 GB. I understand Win 7 home can only handle 16GB. Now is that excluding the RAMDISK? IE: Can the RAMDISK access all of the RAM that Win 7 can't?
> 
> I guess I could always upgrade Win 7 but that's wasted money. This build is on a very strict budget.


Oh please.









Upgrade and then get the Corsair Platinums.


----------



## PTCB

My rig just crapped its pants seeing the amount of awesomeness in your rig. lol

EDIT: BTW, how do you manage your SSDs in RAID0 with no TRIM support? I have two SSDs in RAID0 and would very much like to know how. Cheers.


----------



## stren

Quote:


> Originally Posted by *CallsignVega*
> 
> I set up a 10GB RAM drive and copied Skyrim to it (I think it's around 7GB) from my 8x Vertex 4 mega-LSI RAID and it only took about 5 seconds lol.
> 
> 
> 
> 
> 
> 
> 
> 
> So Mega-RAID setup and RAMDISK, the most overkill gaming storage setup known to man haha. Thinking about trading up my 16 GB RAM to 32 or 64 GB. I understand Win 7 home can only handle 16GB. Now is that excluding the RAMDISK? IE: Can the RAMDISK access all of the RAM that Win 7 can't?
> I guess I could always upgrade Win 7 but that's wasted money. This build is on a very strict budget.


I don't have proof, but if the OS restricts that memory from being addressed, I don't see why a SW RAM disk program would go through the hassle of trying to work around it. It would be way too much hassle. Just put on your big-boy pants and buy Pro.








Quote:


> Originally Posted by *PTCB*
> 
> My rig just crapped its pants seeing the amount of awesomeness in your rig. lol
> EDIT: BTW, how do you manage your SSDs in RAID0 with no TRIM support? I have two SSDs in RAID0 and would very much like to know how. Cheers.


You don't, but with enough SSDs a lack of TRIM is not a big deal. With 2 SSDs: at some point Intel should release its TRIM support for its SATA III chipset, as has been rumored for a long time. There are no real RAID cards that support TRIM yet.


----------



## Brokenstorm

Quote:


> Originally Posted by *CallsignVega*
> 
> Can the RAMDISK access all of the RAM that Win 7 can't?


Depends on which one you use.

I seriously doubt that QSoft will, as it won't even go over the 3.x GB boundary on 32-bit OSes, claiming that:
"A RAMDisk should not use this so called "unmanaged memory" above that boundery for reasons that are described here http://blogs.technet.com/markrussinovich/archive/2008/07/21/3092070.aspx and here http://www.osronline.com/showthread.cfm?link=178074"

Seems like Primo might be able to do it but you would have to test to make sure, link


----------



## Blizlake

Quote:


> Originally Posted by *CallsignVega*
> 
> So Mega-RAID setup and RAMDISK, the most overkill gaming storage setup known to man haha. Thinking about trading up my 16 GB RAM to 32 or 64 GB. I understand Win 7 home can only handle 16GB. Now is that excluding the RAMDISK? IE: Can the RAMDISK access all of the RAM that Win 7 can't?


At least the program my (cheap) ASRock board came with, XFast RAM or something, can use all the extra RAM the OS can't. It isn't RAM disk software, though, but a RAM cache.
I think a RAM disk can use the RAM your OS can't, but it could depend on the program used.
Since you don't seem to be on that tight a budget, how about just saving yourself the headaches and getting Win 7 Professional...


----------



## DigitalSavior

Quote:


> Originally Posted by *Blizlake*
> 
> At least the program my (cheap) ASRock board came with, Xfast RAM or something, can use all the extra RAM the OS can't. It isn't RAM disk software though but a RAM cache one.
> I think that the RAM disk can use the RAM your OS can't, but it could depend on the program used.
> Since you don't seem to be on that tight a budget, how about just save yourself from headaches and get a Win7 professional...


I was thinking the same thing. XFast RAM states it can use all the other RAM not available to Windows (under 32-bit, anyway). So maybe there is some hope for you.


----------



## aamiic

Hey Vega, do you have SC2? I'd love to see what it ends up looking like in portrait mode on that kind of setup.


----------



## feteru

Quote:


> Originally Posted by *aamiic*
> 
> Hey Vega, do you have SC2? I'd love to see what ends up looking like in portrait mode on that kind of setup.


Yeah, this would be really neat. I would think that it would be really irritating to practically have to turn your head to see the minimap.


----------



## CallsignVega

Ya, SC2 is actually limited to 16:9, so it only uses the center three screens. Command centers are the size of your fist, lol.


----------



## stren

Quote:


> Originally Posted by *Blizlake*
> 
> At least the program my (cheap) ASRock board came with, Xfast RAM or something, can use all the extra RAM the OS can't. It isn't RAM disk software though but a RAM cache one.
> I think that the RAM disk can use the RAM your OS can't, but it could depend on the program used.
> Since you don't seem to be on that tight a budget, how about just save yourself from headaches and get a Win7 professional...


Quote:


> Originally Posted by *DigitalSavior*
> 
> I was thinking the same thing. Xfast Ram states it can use all the other RAM not available to windows (under 32bit anyway). So maybe there is some hope for you.


Aye, but those are programs for a specific set of HW. Most RAM drive programs are for generic hardware, which means they rely on drivers usually provided by the OS. No harm in trying, though.


----------



## DigitalSavior

Quote:


> Originally Posted by *stren*
> 
> Aye but those are programs for a specific set of HW. Most ram drive programs are for generic hardware which means they rely on drivers usually provided by the OS. No harm in trying though.


Yea, I realize that program is for Asrock HW but I figured maybe there are other programs that could do the same. I haven't done enough research to say for sure. I've only used trial versions at any rate.


----------



## l88bastar

Apparently a Vega Craze is sweeping the nation and Vegaites have taken to the streets in celebration. I took these pictures earlier today.


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Apparently a Vega Craze is sweeping the nation and Vegaites have taken to the streets in celebration. I took these pictures earlier today.












The AE11 is handling my 3960X very sweetly! Got it up to ~5.1 GHz. It's amazing how much cooler SB-E runs versus IB. My 3960X stays under 65°C in Prime95 with the following settings:










This is a bit nicer overclock than I could get this chip to do on either the RIVE or the Big Bang XP-II. Maybe I'm becoming an ASRock convert.


----------



## nvidiaftw12

Why don't you have the BCLK at 100 and the multi at 51? I always thought it was bad to overclock the BCLK.


----------



## CallsignVega

Why would it be bad? If you're not overclocking the BCLK, you're not squeezing every bit of performance out.
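The trade-off is just arithmetic: the core clock is BCLK times multiplier, and SB-E boards expose BCLK "straps" (nominally 100/125/166 MHz), so the same ~5.1 GHz can be reached several ways. A sketch with illustrative combinations (these are not Vega's exact settings):

```python
# Effective core clock from BCLK and multiplier; raising BCLK also
# raises everything derived from it (RAM, uncore), which is the appeal
# and the risk of BCLK overclocking.

def core_mhz(bclk: float, multiplier: int) -> float:
    return bclk * multiplier

combos = [(100.0, 51), (125.0, 41), (127.5, 40)]
for bclk, mult in combos:
    print(f"{bclk} x {mult} = {core_mhz(bclk, mult):.0f} MHz")
```

All three land around 5.1 GHz; the BCLK routes additionally overclock the memory and other derived clocks, which is the "squeezing every bit out" part.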


----------



## l88bastar

People are going Vegatastic over here...Ive never seen anything like it


----------



## TA4K

Quote:


> Originally Posted by *l88bastar*


I'm joining this club


----------



## armartins

Vega, one quick question. I know you must plug all the monitors into one card, but can you plug them into any card? I need this info because I'm planning on getting one Lightning and one reference 7970, and card order is important because of my external radiator. In other words, must you connect the monitors to the PCI-E slot closest to the CPU?


----------



## CallsignVega

You know, I've never tried to make the "master crossfire" card anything other than card #1. But I'm 99% sure you can. Anyone else try it?


----------



## iCrap

Quote:


> Originally Posted by *CallsignVega*
> 
> You know, I've never tried to make the "master crossfire" card anything other than card #1. But I'm 99% sure you can. Anyone else try it?


Yeah, I have done it. The card you plug the monitors into is the master card; plug the monitors into any one of the cards and it becomes the master.


----------



## Snipe3000

Hmm, is it possible to have profiles? For example, 3 monitors on one card and 5 on another, and swap between the two setups when needed?


----------



## dmanstasiu

Quote:


> Originally Posted by *Snipe3000*
> 
> hmm, is it possible to have profiles? for example, 3 monitors on one card and 5 on another. And swap between the two setups when needed?


Seems unnecessarily complicated

Why would you run 3 monitors when you could run 5 anyways? just unplug two ...


----------



## iCrap

Quote:


> Originally Posted by *Snipe3000*
> 
> hmm, is it possible to have profiles? for example, 3 monitors on one card and 5 on another. And swap between the two setups when needed?


Pretty sure you can't do that; they all have to be on one card. You could do that if you disable CFX, though.


----------



## pandatoucher

Quote:


> Originally Posted by *iCrap*
> 
> Pretty sure you can't do that. they all have to be on one card. You could do that if you disable CFX though.


I think what he means is having an Eyefinity profile set up for each card: one card would drive the 3-monitor setup, and when he wanted to swap to his 5-monitor setup he would just move all the plugs to the other card, which would be pre-configured for it. If I'm not mistaken, I thought you could make several Eyefinity profiles, so you wouldn't need to swap cards if that is the case.


----------



## CallsignVega

With my 5x1 setup I set a hotkey to swap between 5x1 and 3x1 (turning off the two end monitors). It's useful as some games don't work in 5x1.


----------



## Detroitsoldier

Y'know, somehow, my brain-killing indecision on getting another 5850 or getting two 470's doesn't seem like much compared to someone with a 3960X and four 7970's.


----------



## zdude

Hey vega just a quick question will you be picking up the 3970x when it releases this fall?


----------



## CallsignVega

Quote:


> Originally Posted by *Detroitsoldier*
> 
> Y'know, somehow, my brain-killing indecision on getting another 5850 or getting two 470's doesn't seem like much compared to someone with a 3960X and four 7970's.


In the end, it all boils down to the gaming.









Quote:


> Originally Posted by *zdude*
> 
> Hey vega just a quick question will you be picking up the 3970x when it releases this fall?


I guess it depends on how well it overclocks. My 3960X doing 5.1 GHz is doing pretty well.


----------



## Snipe3000

Quote:


> Originally Posted by *dmanstasiu*
> 
> Seems unnecessarily complicated
> Why would you run 3 monitors when you could run 5 anyways? just unplug two ...


I have a racing rig on the other side of the room with dedicated 3 monitors built onto it, having the ability to switch from that to my desktop setup would be sweet.


----------



## dushan24

Quote:


> Originally Posted by *CallsignVega*
> 
> What complete computer gaming system could _literally_ weigh half of a ton (not including the liquid)?
> The build commences...
> 
> Setups of past:
> 
> http://www.youtube.com/user/CallsignVega/videos
> 
> http://hardforum.com/showthread.php?t=1588587
> With an almost mathematical certainty, I am building the world's only 3x FW900 portrait, no bezel gap, seamless Fresnel lens, Ivy Bridge, Quad Kepler, three user-selectable cooling loop (ambient, phase-change, geothermal) gaming computer.
> Your run of the mill system.


That Owl wallpaper is amazing, as is that setup.


----------



## l88bastar

Running two different Eyefinity groups is a nightmare and should be avoided. I did that with my three Dell U2410s and two 1080p projectors on the 5000 & 6000 series cards. The monitors were plugged into one card and the projectors into the other. When you selected CrossFire you could select which card you wanted as the CrossFire primary, so it was possible to do CrossFire with each Eyefinity setup. However, Eyefinity and CrossFire do not play nice with any primary card that is not in slot one, so I would always get crashing on the second Eyefinity group.

Now fast forward to today. I have four MSI Lightnings that I put under water, and unfortunately I discovered AFTER the install that card one is dead (see pic). Because card one is dead, I ran single-card Eyefinity no problem off of card 4, but the second I connected the power to cards 3 & 2 it would no longer let me use card 4 as the primary and I had to shift everything to card 2.

I set up TriFire with cards 2, 3 & 4, with 2 as the primary, and was able to briefly test things out. Unfortunately, the second I tried to change refresh rates, AMD's 12.7 driver went into a repeating crash cycle, which even reboots would not fix. So I reinstalled my OS, reinstalled 12.7 and the 12.7 CAP 3s, and tried again... Once again, the second I tried to change refresh rates on my monitors, my system went into an endless display driver crash cycle.

Long story short, multi-GPU Eyefinity prefers to be in slot one and will not function properly any other way. If you want to run multiple Eyefinity configs, I suggest you build a second rig or get used to physically swapping your rig between display setups. Otherwise you'll drive yourself crazy.


----------



## feteru

Quote:


> Originally Posted by *l88bastar*
> 
> Running two different eyefinity groups is a nightmare and should be avoided. I did that with my three Dell U2410s and two 1080p projectors on the 5000 & 6000 series cards. The monitors were plugged into one card and the projectors in the other. When you selected crossfire you could select which card you wanted crossfire primary on so it was possible to do crossfire with each eyefinity setup. However, eyefinity and crossfire do not like to play nice with any primary card that is not in slot one, so I would always get crashing on the second eyefinity group.
> Now fast forward to today. I have four MSI lightnings that I put under water and unfortunately I discovered AFTER the install that card one is dead (see pic). Because Card one is dead, I ran single card eyefinity no problem off of card 4, but the second I connected the power to cards 3 & 2 it would no longer let me use card 4 as the primary and I had to shift everything to card 2.
> I setup trifire with cards 2,3& 4 with 2 being the primary and was able to briefly test out things. Unfortunately, the second I tried to change refresh rates AMDs 12.7 driver went into a repeating crash cycle, which even reboots would not fix. So I reinstalled my OS, reinstalled 12.7 & 12.7 cap 3s and tried again...Once again, the second I tried to change refresh rates on my monitors my system went into an endless display driver crash cycle.
> Long story short, multi-gpu eyefinity prefers to be in slot one and will not function properly any other way. If you want to run multiple eyefinity configs I suggest you build a second rig or get used to physically swapping your rig between display setups. Otherwise youll drive yourself crazy.


What radiators are you running with your setup? And are you planning to do/doing some mad cooling like TECs or geothermal, or are you just sticking to "standard" watercooling?
Sucks to hear about the dead card, but your setup is pure awesomeness.


----------



## l88bastar

Quote:


> Originally Posted by *feteru*
> 
> What radiators are you running with your setup? And are you planning to do/doing some mad cooling like TECs or geothermal, or are you just sticking to "standard" watercooling?
> Sucks to hear about the dead card, but your setup is pure awesomeness.


I am running this rad in an 18-fan push-pull combo. It's an awesome rad; my GPUs top out at 35°C under full load... Vega helped me a lot when choosing components, so perhaps you should spam his PM inbox

http://www.frozencpu.com/products/12293/ex-rad-196/Watercool_MO-RA3_9_x_120mm_PRO_Extreme_Radiator_-_Black_Powder_Coat_22020.html?tl=g30c95s667

I am quite pleased with the temps from this watercooling setup, so I will be sticking with it for a while, but I will be watching Vega's geothermal shebang if he ever has the time to get around to it.


----------



## CallsignVega

It's not a time issue. It's an "I will be moving soon" issue.







Geothermal loop will definitely be a go when I get setup in a new place.

An update on my build: in addition to my 8x Vertex 4 LSI RAID 0 for programs and games, I'm thinking about running 2x Plextor M5 Pros on the two Intel 6 Gb/s ports for Win 7 Ultimate, and getting 64 GB of 2400 MHz RAM for a massive RAMDisk. But that would be almost $3K just in storage and memory; might be going overboard.


----------



## Citra

Quote:


> Originally Posted by *CallsignVega*
> 
> It's not a time issue. It's an "I will be moving soon issue".
> 
> 
> 
> 
> 
> 
> 
> Geothermal loop will definitely be a go when I get setup in a new place.
> An update on my build: thinking about going in addition to my 8x Vertex 4 LSI RAID0 for programs and games, having 2x Plextor M5 Pro's on the two Intel 6GB ports for Win 7 Ultimate, and getting 64 GB of 2400 MHZ ram for a massive RAMDisk. But that will be almost $3K just in storage and memory, might be going overboard.


For you? Seems just like another average upgrade.


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> It's not a time issue. It's an "I will be moving soon issue".
> 
> 
> 
> 
> 
> 
> 
> Geothermal loop will definitely be a go when I get setup in a new place.
> An update on my build: thinking about going in addition to my 8x Vertex 4 LSI RAID0 for programs and games, having 2x Plextor M5 Pro's on the two Intel 6GB ports for Win 7 Ultimate, and getting 64 GB of 2400 MHZ ram for a massive RAMDisk. But that will be almost $3K just in storage and memory, might be going overboard.


Overboard? I didn't think you even knew that word.

Also, just noticed you have HD 650s, how do you like them? Does the Xonar power them fine?


----------



## Quest99

Quote:


> Originally Posted by *l88bastar*
> 
> I am running this rad in a push pull 18 fan combo. Its an awesome rad, my GPUs top out at 35c under full load...Vega helped me alot when choosing components so perhaps you should spam his PM inbox
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.frozencpu.com/products/12293/ex-rad-196/Watercool_MO-RA3_9_x_120mm_PRO_Extreme_Radiator_-_Black_Powder_Coat_22020.html?tl=g30c95s667
> I am quite pleased with the temps from this watercooling setup so I will be sticking with it for a while, but I will be watching vegas geothermal shabang if he ever has the time to get around to it.


What are the pros and cons of a MO-RA3 vs 3 separate rads?

Do you actually get better temps with the MO-RA3? I tried looking on the net but haven't seen any answers.

The pros, IMO: less restriction and cheaper cost.


----------



## stren

Quote:


> Originally Posted by *Quest99*
> 
> What are the pros and cons for a Mora 3 vs 3 seperate rad ?
> Do you actually get better temps with the Mora 3? I tried looking on the net but havent seen any answers.
> The pros are imo :
> Less restriction and cheaper cost


Mora's are optimized for low speed or even passive airflow. As soon as you have decent rpms you'd be better off with other radiators (assuming the same area). They are convenient if you have the space. I use a mora 140.9 with 700rpm fans (for my CPU only) on my workstation and the temps are ridiculously good and almost silent. I could run it passively and still be fine. For the gaming side I use 2x560's and a 360 all with higher speed gentle typhoons. I think it was (maybe) skinnee's review that showed a gtx560 beating a mora at 1500rpm or something similar. If I had more time today I'd dig out the link for you.


----------



## Quest99

Quote:


> Originally Posted by *stren*
> 
> Mora's are optimized for low speed or even passive airflow. As soon as you have decent rpms you'd be better off with other radiators (assuming the same area). They are convenient if you have the space. I use a mora 140.9 with 700rpm fans (for my CPU only) on my workstation and the temps are ridiculously good and almost silent. I could run it passively and still be fine. For the gaming side I use 2x560's and a 360 all with higher speed gentle typhoons. I think it was (maybe) skinnee's review that showed a gtx560 beating a mora at 1500rpm or something similar. If I had more time today I'd dig out the link for you.


So these are great for a "silent" PC. Interesting... Might slap 2 of these on my custom case along with undervolted fans instead of 5x SR-1s.

Cheers!


----------



## CallsignVega

Quote:


> Originally Posted by *feteru*
> 
> Overboard? I didn't think that you even knew that word
> 
> 
> 
> 
> 
> 
> 
> . Also, just noticed you have HD 650s, how do you like them? Does the Xonar power them fine?


650s are great. The sound is absolutely wonderful in conjunction with the Xonar Essence One. The Xonar is very powerful; I barely have to go above 1/3 on the volume knob before the 650s get too loud. The 650s are a great top-of-the-line headphone for a "reasonable" price; to get any sort of substantial upgrade you have to spend $1000+ on headphones.


----------



## feteru

Quote:


> Originally Posted by *CallsignVega*
> 
> 650's are great. Sound is absolutely wonderful in conjunction with the Xonar Essence One. The Xonar is very powerful, I barely have to go above 1/3 volume knob before the 650's get too loud. The 650's are great top of the line headphone for a "reasonable price". To get any sort of substantial upgrade you have to spend $1000+ for headphones.


Yeah, I have been thinking of HD 600s plus an RSA P-51 Mustang with a 5.5g iPod Classic for an around-the-house setup. Seeing how much I love my SR80i's, I think I would really like the sound of the 600s, although I haven't ruled out the 650s just yet.


----------



## Nope oO

Quote:


> Originally Posted by *CallsignVega*
> 
> Take a look at the Overlord Tempest's. A quality panels, 120 Hz, display port. No need to switch PCB!


Nice. If you were to use 3 1440p monitors, would you get these? Can they be de-bezeled fairly thin?


----------



## CallsignVega

My 8x Vertex-4 RAID 0 gaming array using the integrated LSI 2308 host bus controller on the ASRock Extreme 11:

[benchmark screenshots]

Games load pretty fast.


----------



## DigitalSavior

Looks good, I'm assuming you got your raid issues taken care of. Those numbers are awesome. And Vega, we need more videos of your awesome setup!

Cheers


----------



## CallsignVega

I should have a new video up sometime this week. I plan to get everything running 100% before Guild Wars 2 launches on Friday.


----------



## AikenDrum

Hey Vega, what was your caching config for the storage benchmarks above? Write cache, write flushing, fancycache, etc.


----------



## feteru

Vega, what do you think about the rumor that the launch reference 7990s are only going to have 4 miniDP and 2 DL-DVI? Would this not mean that it is still impossible to do 5x1 portrait eyefinity with 1440p+ monitors (regardless of hardware restrictions)?


----------



## CallsignVega

Quote:


> Originally Posted by *AikenDrum*
> 
> Hey Vega, what was your caching config for the storage benchmarks above? Write cache, write flushing, fancycache, etc.


Heya, with the LSI 2308 none of the above can be enabled.
Quote:


> Originally Posted by *feteru*
> 
> Vega, what do you think about the rumor that the launch reference 7990s are only going to have 4 miniDP and 2 DL-DVI? Would this not mean that it is still impossible to do 5x1 portrait eyefinity with 1440p+ monitors (regardless of hardware restrictions)?


Would be a silly move IMO. I haven't crunched the numbers, but if you overclock the single-link DVI port like I do, you may be able to run the SL-DVI port at around 70 Hz on a Catleap (I get 120 Hz at 1080p). That would work for regular Catleaps but wouldn't be much help for the overclock versions.
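As a rough sanity check on that ~70 Hz estimate, the arithmetic can be sketched out from the 1080p@120 Hz result. The 2080×1111 and 2720×1481 timing totals below are approximate CVT reduced-blanking values assumed for illustration, not measured from the actual monitors:

```python
# Rough pixel-clock arithmetic for the overclocked SL-DVI estimate above.
# Timing totals are approximate CVT reduced-blanking values (assumptions).

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock required for a given total timing and refresh rate."""
    return h_total * v_total * refresh_hz

# 1080p at 120 Hz implies roughly this link speed
# (the SL-DVI spec officially tops out at 165 MHz):
clock = pixel_clock_hz(2080, 1111, 120)   # ~277 MHz

# Reusing that overclocked link for a 2560x1440 Catleap:
catleap_refresh = clock / (2720 * 1481)   # ~69 Hz

print(f"Implied pixel clock: {clock / 1e6:.0f} MHz")
print(f"Max 1440p refresh at that clock: {catleap_refresh:.0f} Hz")
```

Which lands right around the "around 70 Hz" figure above.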


----------



## Bigbrag

I see you guys talking about RAM disks. I run Primo for my RAM disk on my SR-2. I have a 40 GB RAM disk under Windows 7 with no problems. It makes everything fast as hell. When transferring files from my SSD to the RAM disk, I get speeds that aren't even possible for my SSD, like 800 MB/s, and this is running over SATA 2, so I'm not sure how that works. It usually takes me about 20 seconds or less to transfer games onto the RAM disk. I don't use it as much as I thought I would though, and it's annoying because I reboot my computer a lot and have to re-add my games to the RAM disk. It's a blast to play around with though, especially for VMs.


----------



## CallsignVega

Isn't there a way to get the games to load onto the RAMDisk automatically on startup?


----------



## Brokenstorm

Quote:


> Originally Posted by *CallsignVega*
> 
> Isn't there a way to automatically get the games to add automatically on start up to the RAMDisk?


Both QSoft and Primo can do that, but it may lengthen your boot time a lot depending on the size of the image you are loading and the speed of hard drive it's stored on.
Primo also has a delayed load option that should alleviate that.


----------



## CallsignVega

Quote:


> Originally Posted by *Brokenstorm*
> 
> Both QSoft and Primo can do that, but it may lengthen your boot time a lot depending on the size of the image you are loading and the speed of hard drive it's stored on.
> Primo also has a delayed load option that should alleviate that.


Cool, sounds like Primo is the one to get. I have 64 GB of 2400 MHz G.Skill arriving tomorrow. A RAMDisk should run pretty fast in quad-channel.

Got my Plextor M3 Pro OS drive working properly. I used the latest 11.5 driver; it said "this driver isn't guaranteed to work, blah blah" and I forced it to install. Now the numbers are great:

[benchmark screenshot]

No one should be using the Intel C600 X79 drivers; those things are horrible.

Even though the controller is unrelated, I broke a 4K score on my 8x Vertex 4 array:

[benchmark screenshot]

Games on that array should load pretty quickly onto the RAMDisk.


----------



## Nope oO

What's the load time difference for say, BF3 with the 8x V4 RAID 0 compared to your RAMDisk?


----------



## CallsignVega

Quote:


> Originally Posted by *Nope oO*
> 
> What's the load time difference for say, BF3 with the 8x V4 RAID 0 compared to your RAMDisk?


I'm going to do some tests to find out precisely that.

Also going to do some quad 16x 3.0 versus quad 16x 2.0 (which really equals quad 8x 3.0) tests on my 5x1 display setup to see if there is any difference.


----------



## Quest99

I shot these F-18s making a low pass from my roof this weekend, and it made me wonder: does Vega pilot them?


----------



## CallsignVega

Quote:


> Originally Posted by *Quest99*
> 
> I shot these F18 making a low pass from my roof during this weekend and It made me wonder if Vega pilots them?


Sweet, nice F-18s. Those are for guys that like to be on ships. I can't bring a mega gaming computer onto an aircraft carrier, so I don't fly those.

Good news is that the 12.8 drivers are working really well. Bad news is my 1500 W max uninterruptible power supply is screeching at me like a banshee that I'm overloading it. I may have to find a more efficient PSU!


----------



## TA4K

Quote:


> Originally Posted by *Quest99*
> 
> I shot these F18 making a low pass from my roof during this weekend and It made me wonder if Vega pilots them?


Nah, he hasn't set the jets up with a subzero loop, and there aren't any LED's on it. Can't be Vega.


----------



## slickatel

Very nice setup you got there, the youtube gw2 movie brought me here;-)

I'm planning a 3x1 setup myself, but it's so hard to choose between 2x 7970 and 2x GTX 670. I don't have any ATI experience whatsoever, so I would say I'd go for the 670, but then again, thinking from a 3x1 perspective, the ATI might perform better.

I'm not much of a guru, so chances are I'll stick with the same setup for a few years, making the decision a little harder..

Have you experienced any GW2 driver-related issues? I know it's still BWEs and stress tests. I'll be playing GW2 and BF3 most of the time, so that would be my main concern atm.


----------



## feteru

Quote:


> Originally Posted by *slickatel*
> 
> Very nice setup you got there, the youtube gw2 movie brought me here;-)
> I'm planning a 3x1 setup myself but it's so hard to choose between 2x 7970 and 2x gtx 670. I don't have any ati experience whatsoever so I would say I'd go for the 670, but then again, thinking in a 3x1 perspective the ati might perform better.
> I'm not much of a guru so chances are I stick with the same setup for a few years, making the decision a little harder..
> Have u experienced any gw2 driver related issues? Knowing its still BWE's and stresstests. Cause I'll be playing gw2 and bf3 most of the time so that would be my main concern atm.


If you're going beyond one 1080p screen, I would personally go AMD, simply for the higher memory bandwidth and extra VRAM. Also, if you OC the 7970, it will stomp the 670.


----------



## CallsignVega

Heya, glad to bring you to OC.net. That's pretty much all I play myself, GW2 and BF3. So far the 7970 Lightnings in 4-way have worked superbly in both of them. For multi-monitor I prefer the 7970 over the 680. As long as they don't screw anything up between the last BWE3 in which crossfire worked great and release on Friday night (headstart), Xfire should be great. I'll make a post Friday night after the head start opens up and comment about crossfire if you are holding off on the GPU purchase for a few days.


----------



## 161029

Quote:


> Originally Posted by *slickatel*
> 
> Very nice setup you got there, the youtube gw2 movie brought me here;-)
> I'm planning a 3x1 setup myself but it's so hard to choose between 2x 7970 and 2x gtx 670. I don't have any ati experience whatsoever so I would say I'd go for the 670, but then again, thinking in a 3x1 perspective the ati might perform better.
> I'm not much of a guru so chances are I stick with the same setup for a few years, making the decision a little harder..
> Have u experienced any gw2 driver related issues? Knowing its still BWE's and stresstests. Cause I'll be playing gw2 and bf3 most of the time so that would be my main concern atm.


Welcome! These sorts of things will always bring you over.


----------



## slickatel

Quote:


> Originally Posted by *CallsignVega*
> 
> Heya, glad to bring you to OC.net. That's pretty much all I play myself, GW2 and BF3. So far the 7970 Lightnings in 4-way have worked superbly in both of them. For multi-monitor I prefer the 7970 over the 680. As long as they don't screw anything up between the last BWE3 in which crossfire worked great and release on Friday night (headstart), Xfire should be great. I'll make a post Friday night after the head start opens up and comment about crossfire if you are holding off on the GPU purchase for a few days.


I can wait before making my final decision.

I got the headstart too, so I'm looking forward to seeing your post, but I won't blame you if you don't find the time to do so.

I'm pretty sure the GW2 developers will do all they can to make everything run smoothly.

Thanks for welcoming me guys, and giving me good advice already. Appreciate.


----------



## hakkafusion

Quote:


> Originally Posted by *CallsignVega*
> 
> Isn't there a way to automatically get the games to add automatically on start up to the RAMDisk?


Quote:


> Originally Posted by *Bigbrag*
> 
> I don't use it as much as I thought I would though, and its annoying because I reboot my computer a lot and have to re-add my games to the ram disk. Its a blast to play around with though. Especially for Vm's.


Create a .bat and include it in the Startup folder; either use hobocopy, or use Cygwin and rsync it. Plenty of options without having to rely on the RAM drive program.
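For the startup-copy approach, here's a minimal sketch of the same idea in Python — the game and RAM disk paths (and the R: drive letter) are made-up placeholders, not anyone's actual layout:

```python
# Sketch of a startup job that mirrors a game folder onto a RAM disk.
# GAMES and RAMDISK below are hypothetical example paths.
import shutil
from pathlib import Path

GAMES = Path(r"C:\Games\GuildWars2")   # hypothetical install on the SSD array
RAMDISK = Path(r"R:\GuildWars2")       # hypothetical RAM disk target

def sync_to_ramdisk(src: Path, dst: Path) -> None:
    """Copy any file that is missing from, or older on, the RAM disk."""
    for f in src.rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src)
            if not target.exists() or target.stat().st_mtime < f.stat().st_mtime:
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, target)  # copy2 preserves timestamps

if __name__ == "__main__" and GAMES.exists():
    sync_to_ramdisk(GAMES, RAMDISK)
```

Drop a shortcut to the script (or the equivalent .bat) in the Startup folder and the copy happens at every login, sidestepping the RAM drive software's own image loading.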


----------



## CallsignVega

I guess I will see if GW2 crossfire is still working properly tomorrow as there is another stress test.


----------



## CallsignVega

I just spent the last four hours trying to figure out why Guild Wars 2 keeps crashing. I get a "display driver has crashed" followed by a BSOD: "a clock interrupt was not received on a secondary processor within the allocated time interval". No matter what I do, even at stock clocks on everything, it crashes 100% of the time. BF3 works fine though, argh! And GW2 is the main reason I built this PC. I give up, LOL.


----------



## thecrim

NO Vega!

The stress test got extended an hour, you still have time to find out what's wrong.

You using 12.8?


----------



## Mals

Clean windows install or bust.


----------



## CallsignVega

Little video.

http://www.youtube.com/watch?v=4jhrP7C3Cdg


----------



## armartins

Show me the numbers! How high do those babies go? I have three here from a friend for some testing this week, no waterblocks though =/


----------



## Quest99

Is everything stable now for GW2?


----------



## Descadent

Vega, I just wanted to tell you, you have cost me a lot of money.

And my wife doesn't like you.


----------



## l88bastar

Quote:


> Originally Posted by *Descadent*
> 
> vega I just wanted to tell you, you have cost me alot of money.
> and my wife doesn't like you.


Lol, me too! Mine added a 20% 6pm.com shoe surtax on all my hardware purchases, lol.

So I had a crazy moment where I became delusional with the thought that 3x1 landscape served me better than 5x1 portrait, so I broke my 5x1 setup down and gave 3x1 a go again... Man oh man, was I wrong. 5x1 blows 3x1 away!! I took some comparison pics of 3x1 landscape versus 5x1 portrait at similar angles. Yeah, the 3x1 fits more of the "game world" in the frame, but its vertical resolution is so low it's way too compressed and hard to see detail.


----------



## armartins

The only thing I could possibly think of as an upgrade now is a MB waterblock... something regarding cooling performance... and crazy CPU and GPU selection for golden chips/cores... Besides what you've said about your PSU, that is indeed the climax of current hardware. Congratulations

On a side note: now what?


----------



## CallsignVega

Quote:


> Originally Posted by *armartins*
> 
> Show me the numbers! How high those babies go? I'm with three here from a friend for some testing this week, no waterblocks though =/


I can run single display benchmarks at 1320/1850 all day long. The highest speeds I can play BF3 100% stable on the 5x1 setup is 1280/1780.
Quote:


> Originally Posted by *Quest99*
> 
> Is everything stable now for GW2?


No, even with the extended downtime to work on things, GW2 still kept crashing. I think my Windows got borked during my failed testing of the 8x8 GB (64 GB) $1000 RAM kit. Unless ArenaNet changed something that made CrossFire unstable, which is a good possibility. BTW, zooming into the full-screen map still breaks CrossFire; you have to Alt-Tab out and back in again to get it working. Suffice to say, building this computer for GW2 and then having it not work properly before launch due to one of a million variables is quite annoying, to say the least.
Quote:


> Originally Posted by *Descadent*
> 
> vega I just wanted to tell you, you have cost me alot of money.
> and my wife doesn't like you.


Glad I could provide a service.








Quote:


> Originally Posted by *l88bastar*
> 
> Lol me too! Mine added a 20% 6pm.com shoe surtax on all my hardware purchases lol.
> So I had a crazy moment, where I became dillusional with the thought that 3x1 landscape served me better than 5x1 portrait so I broke my 5x1 setup down and gave 3x1 a go again...Man oohhh man I was soo wrong. 5x1 blows 3x1 away!! I took some comparison pics of 3x1 landscape versus 5x1 portrait at similar angles. Ya the 3x1 fits more of the "game world" in the frame but its horizonal resolution is so low its way to compressed and hard to see detail.


Ya, those 5x1 portrait images look only a million times better than 3x1 landscape. Never cared for the latter.
Quote:


> Originally Posted by *armartins*
> 
> The only thing I could possible think off as an upgrade now is a MB waterblock... something regarding cooling performance.... and crazy CPU and GPU sellection for golden chips/cores ... Besides what you've said about your PSU.... that is indeed the climax of current hardware. Congratulations
> 
> 
> 
> 
> 
> 
> 
> 
> On a side note: Now what?


Ya, I would have gone with a MB water-block if someone were to make one.







As for the future, 89xx I guess?


----------



## Descadent

Quote:


> Originally Posted by *l88bastar*
> 
> Lol me too! Mine added a 20% 6pm.com shoe surtax on all my hardware purchases lol.
> So I had a crazy moment, where I became dillusional with the thought that 3x1 landscape served me better than 5x1 portrait so I broke my 5x1 setup down and gave 3x1 a go again...Man oohhh man I was soo wrong. 5x1 blows 3x1 away!! I took some comparison pics of 3x1 landscape versus 5x1 portrait at similar angles. Ya the 3x1 fits more of the "game world" in the frame but its horizonal resolution is so low its way to compressed and hard to see detail.
> 
> 
> Spoiler: Warning: Spoiler! [comparison images]


mind sharing just your 3 monitors in portrait?


----------



## CallsignVega

I tested the Extreme 11 in 16x 2.0 mode on all slots (8x 3.0, like the Sniper 3 I had, or the last three GPU slots on native X79) versus 16x 3.0 on all slots. I saw no performance difference. It's only when you drop down to 8x 2.0 that you get hurt with many GPUs/monitors. So the Extreme 11, even with quad 16x via the two PLX chips, won't really do anything for you versus native X79. Will that change with the 89xx series? Only time will tell.

As a caveat, this only applies to AMD, since they run native PCI-E 3.0 on X79, unlike silly NVIDIA which has "issues". So the E11 will still be the best board for NVIDIA users.
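The "16x 2.0 equals 8x 3.0" equivalence falls straight out of the per-lane numbers; a quick back-of-envelope using the spec transfer rates and line-code overheads (nothing board-specific):

```python
# Usable per-lane PCIe bandwidth after line-code overhead, showing why
# x16 Gen2 and x8 Gen3 land in the same ballpark.

def lane_mb_s(transfer_gt_s, payload_bits, coded_bits):
    """MB/s per lane: raw transfer rate times line-code efficiency, over 8 bits/byte."""
    return transfer_gt_s * 1e9 * (payload_bits / coded_bits) / 8 / 1e6

gen2 = lane_mb_s(5.0, 8, 10)      # PCIe 2.0: 5 GT/s with 8b/10b   -> 500 MB/s
gen3 = lane_mb_s(8.0, 128, 130)   # PCIe 3.0: 8 GT/s with 128b/130b -> ~985 MB/s

print(f"x16 Gen2: {16 * gen2 / 1000:.1f} GB/s")  # 8.0 GB/s
print(f"x8  Gen3: {8 * gen3 / 1000:.1f} GB/s")   # ~7.9 GB/s
print(f"x16 Gen3: {16 * gen3 / 1000:.1f} GB/s")  # ~15.8 GB/s
```

So an x8 Gen3 link gives up only about 1.5% to x16 Gen2, which matches the "no performance difference" result above.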


----------



## Snipe3000

Hey Vega, did you mention something about your PSU being over-tasked?
EVGA finally released their new 1500 W PSU.


----------



## CallsignVega

Ya, I am thinking about getting it. Bit pricey though.


----------



## Quest99

Quote:


> Originally Posted by *CallsignVega*
> 
> Ya, I am thinking about getting it. Bit pricey though.


Did I read that right? Pricey? Didn't think it was in your vocabulary.


----------



## Samurai Batgirl

Vega, did you ever get the RAID (controller) problem fixed?


----------



## Nope oO

Quote:


> Originally Posted by *l88bastar*
> 
> 5x1 blows 3x1 away!! I took some comparison pics of 3x1 landscape versus 5x1 portrait at similar angles. Ya the 3x1 fits more of the "game world" in the frame but its horizonal resolution is so low its way to compressed and hard to see detail.


Does the horizontal refresh rate when flipped to portrait mode affect you? Horizontal refresh is usually lower and it's now your vertical. John Carmack said there's something like 15ms difference between the top of the monitor and the bottom when refreshing.
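That top-to-bottom delay is roughly one frame period scaled by the fraction of lines that are visible; a quick estimate (1125 total lines is the standard 1080p timing total, assumed here for illustration):

```python
# Back-of-envelope scanout latency: how long the display takes to paint
# from the first visible line to the last within one refresh.

def scanout_ms(refresh_hz, active_lines, total_lines):
    """Milliseconds from the top of the visible frame to the bottom."""
    return 1000.0 / refresh_hz * active_lines / total_lines

print(f"60 Hz:  {scanout_ms(60, 1080, 1125):.1f} ms")   # ~16 ms
print(f"120 Hz: {scanout_ms(120, 1080, 1125):.1f} ms")  # ~8 ms
```

At 60 Hz that works out to roughly the ~15 ms figure quoted, and doubling the refresh rate halves it. Rotating the panel to portrait changes the scan direction relative to the image, not the timing itself.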

Does anyone have a picture of 3 Overlord Tempests in action? I take it they're the best 1440p 120hz monitors?


----------



## nvidia3

Vega, why did you change your setup from GTX 680s to 7970s? Which card is better?


----------



## RB Snake

NVIDIA Surround can't do 5 monitors, only Eyefinity can. So he couldn't use the GTX 680s.


----------



## British

Quote:


> Originally Posted by *Nope oO*
> 
> Does anyone have a picture of 3 Overlord Tempests in action? I take it they're the best 1440p 120hz monitors?


Those monitors should be available around October, so for the time being it's all about the Catleap when it comes to 1440p@120Hz.

Vega had such a setup (3 Catleaps in portrait, IIRC) at some point, and I remember a video of Skyrim with it.
You might want to check that thread (or maybe another) for more details.

I can sum up what bothered me: the Catleap (and most likely all other 1440p monitors) has a much thicker bottom bezel, so the monitors had to be overlapped in portrait mode to make it not completely borked...


----------



## CallsignVega

I also have a 144 Hz Asus 27" inbound to test out.


----------



## MenacingTuba

Quote:


> Originally Posted by *CallsignVega*
> 
> I also have a 144 Hz Asus 27" inbound to test out.


Let me know what the anti-glare is like. It would be awesome if it were semi-glossy like their VG23AH.


----------



## CallsignVega

I think it has the same panel as the 120 Hz 27", so it will be matte. Hopefully it isn't as sparkly and nasty as something like the U2711. Otherwise I just might have to give it the old wet paper towel job!


----------



## MenacingTuba

Quote:


> Originally Posted by *CallsignVega*
> 
> I think it has the same panel as the 120 Hz 27", so it will be matte. Hopefully it isn't as sparkly and nasty as something like the U2711. Otherwise I just might have to give it the old wet paper towel job!


Only matte LG IPS+TN panels use the aggressive coating. I believe Asus is using an AUO TN panel, and like Samsung they have always used lighter coatings. The VG278H I saw in a store looked like it had a lighter matte coating than the Samsung TN it was next to (medium AG) when I turned both off, but it definitely was not as reflective as the semi-glossy VA Samsung it was also next to.


----------



## CallsignVega

Good to hear. That LG IPS coating is just ridiculous.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> I also have a 144 Hz Asus 27" inbound to test out.


Here is a screenshot of BF3 in 1080p:

[screenshot]
Quote:


> Originally Posted by *Descadent*
> 
> mind sharing just your 3 monitors in portrait?


Yep here ya go


----------



## CallsignVega

Quote:


> Originally Posted by *l88bastar*
> 
> Here is a screen shot of bf3 in 1080p
> 
> 
> 
> 
> 
> 
> 
> 
> Yep here ya go


I really don't know how you guys put up with those lines. Just get a big 70-inch TV set.


----------



## revro

Hello,

I spent the entire Sunday reading this whole topic while passively watching a TV marathon on my second monitor, and just like Vega changed his ways, I too changed my proposed rig build based on his experience.

So now I plan to go next spring with 2x MSI 7970 Lightning 3GB, the Haswell equivalent of the i7-3770K, and 3/5x BenQ XL2420T 120 Hz monitors
- thin bezels if put behind each other, and they have pivot, so a 3/5x1 portrait setup is possible.
I won't overclock anything, so I just hope there won't be a CPU bottleneck.

My questions:
1. What Mini DisplayPort-to-DVI adapter are you using to get 120 Hz on the monitors, and how much did it cost? Do you then connect it with DL-DVI to the DVI port on the monitor so you get 120 Hz?
2. Are you using 2 SL-DVI cables and some special .sys file? Where can we get the .sys file from?
3. How many FPS do 2x 7970 Lightnings get on 5x 120 Hz monitors in BF3?

thank you
revro

PS: LOL, now I know where the name Alienware comes from.


----------



## zdude

Quote:


> Originally Posted by *l88bastar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> I also have a 144 Hz Asus 27" inbound to test out.
> 
> 
> 
> Here is a screen shot of bf3 in 1080p
Click to expand...

lol why did you steal my screenshot


----------



## armartins

Vega, any word about the RAM drive benches? I can only wonder what numbers you will push with 2400 MHz quad-channel.


----------



## CallsignVega

The 64 GB of 2400 MHz was a bust; you just can't run that kind of speed with that much memory. I'm not having $1k worth of RAM run much slower than rated, so I sent it back.


----------



## 161029

Quote:


> Originally Posted by *revro*
> 
> My questions:
> 1. What mini DisplayPort to DVI adapter are you using to get 120 Hz on the monitors, and how much did it cost? Do you then connect it with DL-DVI to the DVI port on the monitor to get 120 Hz?
> 2. Are you using 2 SL-DVI cables and some special .sys file? Where can we get the .sys file from?
> 3. How many FPS do 2x 7970 Lightnings get in BF3 across 5x 120 Hz monitors?
> 
> thank you
> revro


I don't know if two 7970's will cut it, so I would advise against a 5x portrait setup. You'll need a minimum of 120 FPS for 120 Hz to be effective, IIRC.

1. I don't know which one he uses, but Monoprice has a great one; it does take up one USB port for power. http://www.monoprice.com/products/product.asp?c_id=104&cp_id=10428&cs_id=1042802&p_id=6904&seq=1&format=2
2. I believe he is just using DVI-D dual-link cables.
3. My best guess is that it'll be under 120 FPS, so 120 Hz won't be able to shine.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> I really don't know how you guys put up with those lines. Just get a big 70 inch TV set.


I know, I am selling everything and getting three of these for portrait-surround 2400x1280 3D gaming, and going with NVIDIA... no more bezels for me















http://www.provantage.com/nec-display-solutions-np-v300w~ANECU0FA.htm


----------



## Quest99

Quote:


> Originally Posted by *l88bastar*
> 
> I know I am selling everything and getting three of these for portrait surround 2400x1280p 3D gaming and going with Nvidia...No more bezels for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.provantage.com/nec-display-solutions-np-v300w~ANECU0FA.htm


The link doesn't work for me.


----------



## Mals

It is 3 projectors... no bezels indeed, but you are going to have to play in pitch dark to get any kind of image quality. What kind of response time/input lag/etc. are you getting from gaming on projectors? I don't see this being a good idea =P


----------



## placidity

CallsignVega I love your setup! I wanted to ask your advice on something. I've finally upgraded to the X79 platform (just haven't upgraded my sig until it's put together) and I was wondering if you can debezel the Korean Catleap or Crossover monitors and run them in 5x1? I know it would require a ton of GPU power I was just wondering if it can be done.


----------



## nathanak21

Quote:


> Originally Posted by *placidity*
> 
> CallsignVega I love your setup! I wanted to ask your advice on something. I've finally upgraded to the X79 platform (just haven't upgraded my sig until it's put together) and I was wondering if you can debezel the Korean Catleap or Crossover monitors and run them in 5x1? I know it would require a ton of GPU power I was just wondering if it can be done.


He tried three 2B Catleaps and couldn't get what he wanted.


----------



## revro

Vega had to switch to 5x 1920x1080 Samsung 120 Hz LCDs.

Hmm, these active adapters are too costly at ~100 EUR each; 3-4 of them cost as much as one 120 Hz BenQ monitor.
What adapters is Vega using for 120 Hz, and is he using SL-DVI or DL-DVI to get 120 Hz over DVI?

thank you
revro


----------



## GREG MISO

Nvidia Surround doesn't support more than 3 screens.


----------



## wongwarren

Quote:


> Originally Posted by *GREG MISO*
> 
> nvidia surround doesnt support more than 3 screens.


It now does 3+1.


----------



## GREG MISO

Quote:


> Originally Posted by *wongwarren*
> 
> It now does 3+1.


Let me rephrase that. Nvidia surround does not support 5 screens.


----------



## dmanstasiu

Quote:


> Originally Posted by *wongwarren*
> 
> It now does 3+1.


Yeah, but you can still only stretch the game across 3 monitors, not 4. Thus the Surround portion affecting gaming is still 3... IIRC.


----------



## Snipe3000

Quote:


> Originally Posted by *l88bastar*
> 
> I know I am selling everything and getting three of these for portrait surround 2400x1280p 3D gaming and going with Nvidia...No more bezels for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.provantage.com/nec-display-solutions-np-v300w~ANECU0FA.htm


Blended projectors are nice, but things do get a bit more complicated if you go that route.


----------



## AikenDrum

Quote:


> Originally Posted by *Snipe3000*
> 
> Blended projectors are nice, but things to get a bit more complicated if you go that route.


If I had a darkened room, I'd do it. I'd get a curved screen and try out nthusim. Alas, I do not have a darkened room.

I think I'd pick a higher-res projector, though. I dunno about 1280x800. I guess if you blend three+ of 'em in portrait, you'd get a fair number of pixels, but... still... it's a well-known fact that MORE IS ALWAYS BETTER.


----------



## AikenDrum

Quote:


> Originally Posted by *CallsignVega*
> 
> The 64 GB 2400 MHz was a bust. Just can't run that kinda speed with that much memory. Not having $1k worth of RAM run much slower than rated so I sent it back.


Well, damn. I figured if anyone was gonna get it working, it was going to be you. And then I could cheat off of your homework, because that's easier than doing all of the nasty legwork myself.







Oh, well. Guess I have to mess with it myself again.

By the way, speaking of stealing your wisdom, I had a question for you: if you were someone like me, who has a solid reason why it's always going to be 3 landscape monitors and never 5 portrait monitors (the reasons are complicated, just take my word for it), would you still go with AMD, or would NVIDIA come back into the picture?


----------



## Pip Boy

http://www.youtube.com/watch?v=dOY2lREuwjU

do this next ^


----------



## 161029

Quote:


> Originally Posted by *phill1978*
> 
> http://www.youtube.com/watch?v=dOY2lREuwjU
> 
> do this next ^


Full HD projectors are only 1080p, not 1440p.


----------



## PCModderMike

Quote:


> Originally Posted by *l88bastar*
> 
> Lol me too! Mine added a 20% 6pm.com shoe surtax on all my hardware purchases lol.
> So I had a crazy moment, where I became dillusional with the thought that 3x1 landscape served me better than 5x1 portrait so I broke my 5x1 setup down and gave 3x1 a go again...Man oohhh man I was soo wrong. 5x1 blows 3x1 away!! I took some comparison pics of 3x1 landscape versus 5x1 portrait at similar angles. Ya the 3x1 fits more of the "game world" in the frame but its horizonal resolution is so low its way to compressed and hard to see detail.
> 
> 
> Spoiler: Warning: Spoiler!


Wow, the 5x1 does look a lot better than 3x1. I used to run NVIDIA Surround at one time and liked it a lot, but after seeing a side-by-side comparison like that, and of course watching all of Vega's videos on YouTube... I see 5x1 in my future! Just curious, what are your specs? And what kind of monitors are those?


----------



## stren

Quote:


> Originally Posted by *PCModderMike*
> 
> Wow the 5x1 does look a lot better than 3x1. I used to run nVidia surround at one time, and I liked it a lot, but after seeing a side by side comparison like that, and of course watching all of Vega's videos on YouTube...I see 5x1 in my future! Just curious, what are your specs? And what kind of monitors are those?


+1, thanks for the compare - you really sold me on 5x1 as the way forward. It's silly how ATI requires all the outputs on one card, though; if you support so many screens, why not make the outputs flexible? Now I should get rid of these 580's.









FYI, l88bastar uses a similar setup to Vega's: the 120 Hz Samsungs debezelled (l88bastar uses the 27" I think, and Vega the 23") and quad 7970 Lightnings.

Can't wait for Vega's info on 16x vs 8x for quad-CrossFire, and some feedback on the 144 Hz screens.


----------



## PCModderMike

Quote:


> Originally Posted by *stren*
> 
> +1 Thanks for the compare - you really sold me on 5x1 as the way forward. It's silly how ati require all the outputs on one card though, if you support so many screens why not make the outputs flexible. Now I should get rid of these 580's
> 
> 
> 
> 
> 
> 
> 
> 
> *FYI L88bastard uses a similar setup to vega, the 120hz samsungs debezelled (bastard uses the 27" I think and Vega the 23") and quad 7970 lightnings.*
> Can't wait for vega's info on the 16x vs 8x for quad-XFire. And some feedback on the 144Hz screens


Phew, way out of my league.


----------



## Mhill2029

Quote:


> Originally Posted by *stren*
> 
> +1 Thanks for the compare - you really sold me on 5x1 as the way forward. It's silly how ati require all the outputs on one card though, if you support so many screens why not make the outputs flexible. Now I should get rid of these 580's
> 
> 
> 
> 
> 
> 
> 
> 
> FYI L88bastard uses a similar setup to vega, the 120hz samsungs debezelled (bastard uses the 27" I think and Vega the 23") and quad 7970 lightnings.
> Can't wait for vega's info on the 16x vs 8x for quad-XFire. And some feedback on the 144Hz screens


I've always wondered where those two get the time to do it; setting up, modding, and configuring their setups is very time consuming, and they spend a lot of time on OCN fiddling with their systems and new products... where they get the time to work for a living is what I'm wondering, lol.

I've never seen anyone shell out cash on hardware like Vega does (well, maybe maxishine); he gets monitors like they fall from the sky. Where all this money comes from is what I'm curious about. 'Tis not normal...


----------



## armartins

Quote:


> Originally Posted by *Mhill2029*
> 
> I've always wondered where those 2 get the time to do it, setting up/modding and configuring their setups is very time consuming and they spend a lot of time on OCN fiddling with their systems and new products.....where do they get the time to work for a living is what i'm wondering lol
> I've never seen anyone shell out cash like VEGA does (well maxishine maybe) on hardware, he gets monitors like they fall from the sky. Where does all this money come from is what i'm curious about. Tis not normal....


In fact, Vega and l88bastar run a nasty business that uses some sort of steganography that needs four highly OC'd 7970s to be decoded, and it needs to be displayed on a 120 Hz 5x 1080p setup all at once to be understood... meanwhile they make us drool and spend our money. My guess is that Vega does some sort of transport (like a helicopter version of the Transporter, feat. Jason Statham) and l88bastar is the dealer or the middle man... Even though I haven't figured out yet why l88 uses 27"... my guess is that he has slightly worse eyesight.









On another note... I have some fun stuff being done in the new home; see if you guys can solve this mystery:

Have this set running up


to this place at the top


and another similar set running down to that place under the bricks...


P.S. 1 = @maxishine: I don't know the guy, but building an SR-2 gaming machine, using a TV as a monitor, a GTX 690 for 3x 30" screens, and 7770 GPUs with 15" screens for his friends tells me a lot about him.

P.S. 2 = Vega, reply to my heat evaluation, and also here if possible!


----------



## nvidiaftw12

Geothermal. Nice.


----------



## Descadent

Has anyone gotten GW2 to work at 4320x2560? It just flickers and crashes.


----------



## l88bastar

Lol, no, I have my own real estate consulting business, and my gaming rig doubles as my primary work rig. Five screens really help productivity a lot.









While expensive, my rig has been an evolution five years in the making... what started in 2007 as a lowly Core 2 Quad, 8800 Ultra, and XHD3000 has morphed over the years into what I have now. A continual cycle of upgrading and selling over the years has spread the wallet pain out considerably.


----------



## CallsignVega

Asus VG278HE 144 Hz 27" 1080P and Samsung S23A750D 23" 120 Hz comparison:

The Asus comes in a nice, thick, sturdy box and is well protected; Asus does a nice job on presentation. The monitor has a gloss-black bezel that is matte on the inside edge to alleviate any glare from the screen. The stand is very sturdy and has really nice height adjustability. As a tall bastard, I think this may be the first monitor I've had since the HP ZR30W that can adjust to my viewing height without having to prop anything up underneath it. That is a huge plus!

The base of the stand is also gloss black, but alas, it has a very large and (IMO) silly 3D symbol. The monitor also swivels and tilts, which is nice. There are three connection options (really only one, as only the first can run 144 Hz): DL-DVI, HDMI, and VGA D-Sub 15. Not sure why anyone would use the VGA connection. One thing I dislike about the connections is that the DVI port is _very_ close to the body of the monitor; my big-daddy 24-gauge, high-quality Monoprice cable barely fits on there, and at an angle that feels like it's going to break the DVI connection. There's no need for it to be so close to the back of the monitor.

The screen is matte, but it does not have the large amount of diffusion and sparkle of an LG IPS panel. If the Apple Cinema Display (the glossiest screen I've used) were a 10, and the U2711 the worst matte nastiness at a 1, I'd rate this panel's anti-glare around a 3.5. I prefer glossier panels, as discussed below. The Samsung isn't quite as reflective as the Apple, but it is still pretty strong at an 8.5, and I would rate the Yamakasi Catleap about a 7. So to break the gloss levels down:

Apple Cinema Display: 10
Samsung 120 Hz line: 8.5
Glossy Catleaps and variants: 7
Asus 144 Hz: 3.5
LG IPS (like the U2711): 1

The on-screen display on the Asus could be better; it feels a bit clunky in the way you switch between configuration options. The keys are tactile and work well - nothing beats good old tactile buttons, and the touch-sensitive buttons on the Samsung can be quite annoying. The power LED on the monitor is ingeniously lit underneath the bottom-right portion of the bezel, which means it is not in your viewable area in a normal sitting position. I don't like bright LEDs shining right at me on the edge of a screen; I usually darken those out with a permanent marker.

Now, the Asus does have some back-light bleed around the edges, mostly on the bottom, but the bleed does not extend out far. I rate BLB low on my priority list, as it usually does not affect normal viewing and I rarely view anything with very dark or solid black images. My unit came with zero dead or stuck pixels. Hopefully that is a sign that Asus does some quality control on these panels, as they are priced fairly high. Of course, I could have just gotten lucky in the panel lottery.

Back to the anti-glare coating. While using the monitor in a day-lit room here, it was a pleasure to see essentially no reflection besides a barely visible diffused glow. The Samsung next to it, on the other hand, reflects quite a bit and can be quite distracting; gloss screens should really only be used in light-controlled environments. Now here comes the trade-off - there is always a trade-off with LCD tech. While you virtually eliminate reflections with the Asus, you also lose a lot of the subjective "pop" and "wet look" of the colors and the deeper perceived blacks. Blacks just look a lot better on the Samsung. I didn't take any fancy colorimeter readings or any of that jazz; this is entirely the subjective view of someone who has tested dozens and dozens of monitors down through the years.

The Samsung has a bit better picture quality: the better blacks, colors that pop more, and contrast just seem to edge out the Asus. Whether that is entirely down to the gloss coating or Samsung wizardry, I do not know. So that brings us to pixel pitch. 1920x1080 at 27" just doesn't work for me if I sit too close; you can clearly see that text is rougher and sharpness is decreased at this size. Everything appears clearer and sharper on the 23" Samsung, allowing you to sit close. You can alleviate some of that problem with the Asus simply by pushing the monitor further back on your desk, but then you lose the "immersion" factor that a larger screen is supposed to provide. For a normal viewing distance of around 24-30 inches, 27-30" monitors should solely be the realm of 1440p or 1600p. That said, the larger pixels are more noticeable in Windows and on web pages than they are in games, since games usually don't have a lot of text or compartmentalized information all over the place.
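For reference, the pixel-density gap being described is easy to quantify. A quick sketch (my own arithmetic, not from the review) of pixels per inch for a 1080p panel at both sizes:

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
# Plain geometry for the two 1080p sizes discussed; viewing
# distance is not factored in.

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel given its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for size in (23, 27):
    print(f'{size}": {ppi(1920, 1080, size):.1f} PPI')
# 23" works out to ~96 PPI and 27" to ~82 PPI, hence the rougher text.
```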

Text size in a side-by-side:










So the big question: how is *144 Hz*?! Honestly, the jump from 120 Hz to 144 Hz is not as great as I was expecting; we are definitely getting into the realm of diminishing returns. I tried to capture any "measurable" differences using PixPerAn and other means, but they did not "pan out". PixPerAn would always crash at 144 Hz but worked fine at 120 Hz, so I could only get readings for the Samsung. I then set about taking high film-speed shots of each monitor in Skyrim during the AFK fast camera spin. Due to monitor-cloning limitations that set both to 120 Hz, I was unable to do a side-by-side in one shot. Also, due to different LED back-light modulation, high ISO settings produced viewable images on one monitor and not the other; change the ISO settings and the monitors would flip-flop. I did not get any conclusive images. So that brings it back to my subjective observations: as a long-time FPS player, I think I have a decent grasp of how a monitor "handles".

Just for a point of reference, let's say the jump from 60 Hz to 120 Hz produces about a 50% increase in the image's "smoothness". The faster the image is displayed, the more the returns in perceived smoothness diminish. While the increase from 120 Hz to 144 Hz is a mathematical 20% speed increase, I would put my subjective increase in smoothness around 5 to 8%. Whether that is getting into placebo territory, or the larger image versus the 23" Samsung is making a "perceptual" difference, I could not say. I would honestly say, though, after playing BF3 and Guild Wars 2 (which is silky smooth) at 144 Hz and 144+ FPS, that the Asus does have a small edge on the Samsung. If you are expecting this monitor to have a _large_ difference in playability over 120 Hz due to its extra 24 Hz, you will be disappointed.
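The diminishing returns described here fall straight out of frame-time arithmetic; a small sketch (the numbers are pure math, not measurements):

```python
# Frame time shrinks hyperbolically with refresh rate, so each Hz bump
# buys less absolute time saved per frame than the last.

def frame_time_ms(hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / hz

for lo, hi in [(60, 120), (120, 144)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    pct = (hi - lo) / lo * 100
    print(f"{lo} Hz -> {hi} Hz: +{pct:.0f}% refresh, "
          f"{frame_time_ms(lo):.2f} ms -> {frame_time_ms(hi):.2f} ms "
          f"({saved:.2f} ms saved per frame)")
```

60 to 120 Hz halves the frame time (16.67 ms down to 8.33 ms), while 120 to 144 Hz shaves off only another ~1.4 ms, which lines up with the "5 to 8%" subjective impression.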

I also noticed that, on web pages with smooth scrolling, the Asus smeared just a tad less than the Samsung. I truly got spoiled with the FW900, as smooth-scrolling text, even fast, was crystal clear; LCDs don't handle that too well. Still, this is the LCD that has handled smooth-scrolling text the best for me. Even a somewhat quick IPS screen like my 120 Hz Catleap smeared pretty badly and did not make viewing smooth-scrolling text all that enjoyable.

Input lag:

SMTT 2.0, ISO 500, Samsung left Asus Right.



















I would say the Asus starts a new frame about 8 milliseconds faster than the Samsung. The last I read, the Samsung panels have around 12-15 ms of input lag, which would put my readings pretty close to Prad's rating of the 120 Hz version of the Asus at ~4 ms. That is pretty quick - basically zero input lag.

Aesthetically, I would give the nod to the Samsungs, especially the 950D line: full brushed-aluminum/gloss-black enclosures with nice clear and black gloss trim. I think they are very stylish. While not unattractive, the Asus is your typical black rectangle that virtually all monitors adhere to.

Images of the Samsung next to the Asus:





































So where does the 144 Hz Asus fit in the grand scheme of things? Until we get the perfect display technology, you will always have to compromise with LCD; you just have to weigh how a monitor's design priorities fit your own. Vega's crude and simple rating system:

In order of: Samsung S23A750D, Asus VG278HE, 2B Yamakasi Catleap.

Image Quality: |<- - - - - - 7 - - ->| |<- - - - - 6 - - - ->| |<- - - - - - - - 9 ->|
*Motion Smoothness*: |<- - - - - - - 8 - ->| |<- - - - - - - - 9 ->| |<- - - - - - 7 - - ->|
Text/OS: |<- - - - - - 7 - - ->| |<- - - - - - 6 - - ->| |<- - - - - - - - 9 ->|
Games: |<- - - - - - 7 - - ->| |<- - - - - - - 8 - ->| |<- - - - - - - 8 - ->|
Aesthetics: |<- - - - - - - 8 - ->| |<- - - - - - 7 - - ->| |<- - - - - 6 - - - ->|

Totals (As a gamer, motion has 2x score weight): Samsung (7.5) - Asus (7.5) - *2B Catleap* *( 7.66)*.

For an overall monitor that does a lot of things well and nothing really badly (besides a flimsy stand that will be corrected in upcoming versions), I'd still have to give the nod to the 120 Hz Catleap. It has a great IPS picture and 1440p resolution to go along with its decent motion. Short of getting yourself a FW900, the Asus will serve you quite well for competitive gaming; I don't think there is a better LCD out there for that. Just don't sit too close to it.
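For anyone who wants to re-weight the table to their own taste, the totals above can be reproduced as a simple weighted mean; here is a sketch with motion counted twice, per the gamer weighting:

```python
# Scores copied from the rating table above; each total is the mean of
# the five categories with "Motion Smoothness" counted twice.

scores = {
    "Samsung S23A750D": {"image": 7, "motion": 8, "text": 7, "games": 7, "looks": 8},
    "Asus VG278HE":     {"image": 6, "motion": 9, "text": 6, "games": 8, "looks": 7},
    "2B Catleap":       {"image": 9, "motion": 7, "text": 9, "games": 8, "looks": 6},
}

def weighted_total(s: dict, motion_weight: int = 2) -> float:
    """Mean score with the motion category weighted `motion_weight` times."""
    total = sum(s.values()) + s["motion"] * (motion_weight - 1)
    return total / (len(s) + motion_weight - 1)

for name, s in scores.items():
    print(f"{name}: {weighted_total(s):.2f}")
# The Samsung and Asus both land on 7.50; the Catleap edges ahead at 7.67.
```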







One thing to note about the 2B Catleap and its future Overlord Tempest OC models: only select hardware configurations can run them optimally. A single overclocked GTX 680 or 7970 can handle the 1080p Asus or Samsung fairly well; with ~78% more pixels on the Catleap, not so much. This forces you into SLI or CrossFire. SLI and NVIDIA are ruled out, as they have adopted a 330 MHz pixel clock limit in their drivers and refuse to budge, which limits your Catleap to around 82 Hz. Really, the only combination of hardware (and modified drivers, mind you) that can "overclock" the DL-DVI port is the AMD 79xx series. And pushing that many pixels to do a 1440p Catleap justice requires at least two of them.
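As a rough sanity check of that ~82 Hz figure: maximum refresh is just the pixel clock divided by the total pixels per frame, including blanking. The sketch below assumes CVT reduced-blanking totals (2720x1481 for a 2560x1440 mode); the monitor's actual timings may differ slightly.

```python
# Max refresh = pixel clock / (horizontal total * vertical total).
# 2720x1481 is the assumed CVT-RB total for a 2560x1440 active area.

def max_refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    """Highest refresh rate a pixel clock can drive for given frame totals."""
    return pixel_clock_hz / (h_total * v_total)

# NVIDIA's 330 MHz driver cap on a 2560x1440 panel:
print(round(max_refresh_hz(330e6, 2720, 1481), 1))  # ~81.9 Hz
```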

The sweet spot for optimal gaming on a decent budget is an IB CPU, 2x 7970's, and an overclocked Catleap or its upcoming variations. Let me know if you have any questions.


----------



## Mhill2029

Now that is my kind of review, thanks for taking the time bro +Rep


----------



## armartins

+Rep also, Vega. I guess by now you can reassemble those Samsung monitors faster than the factory, lol. On the other hand, your positive comments on the Catleap are making me reconsider the 120 Hz version of it again.


----------



## esp42089

That is pretty awesome. Sounds like you would be on Catleaps right now if it weren't so difficult to get the drivers and everything to play nice. Do you have any experience with 7970s on the X58 platform? I have been contemplating a jump to two 7970s, but I'm not sure it's worth it on my rig.


----------



## CallsignVega

The 120 Hz Catleaps are way too demanding for a 5x1 setup; they make great single monitors. 7970's will run fine on your X58 system - they would each have their own 16x PCIe 2.0 slot, which is good. You can bump up that 980X a bit, too.


----------



## hammerforged

Well done Vega. We all most certainly appreciate the effort you put into this review.

I currently have a 1440p Catleap, non-2B. I miss 120 Hz smoothness (I used to have a S27A750D). Would you consider the switch from my Catleap to this 144 Hz Asus a good move for someone who misses 120 Hz? Or would you suggest another 120 Hz monitor?

Thanks


----------



## stren

Thanks Vega - so it sounds like you won't be changing your 5x1 setup to the asus monitors then?


----------



## revro

Hello,
Quote:


> Originally Posted by *CallsignVega*
> 
> I found that even using single link DVI connection of the 7970 Lightnings (165 MHz), I can get around 75 Hz out of the 700D (right landscape monitor). So it will still be better than 60 Hz and not as noticeable as it will be an end screen on the 5x1 setup. Eyefinity is great in that it will allow four screens to run at 120Hz and an odd ball screen to run at a lower Hz.


So to get 120 Hz, you are using mini DisplayPort to DisplayPort cables on 4 of the Samsung S23A750 LCDs - that's just a simple cable, so no expensive adapter is needed -
and the fifth Samsung LCD is connected via a DVI cable and runs at 75 Hz?

Also, would trifire 7970s be enough to feed 5x 1920x1080 @ 120 Hz?

thank you
revro


----------



## chuckinbeast

Great review Vega, thanks for that! +REP


----------



## l88bastar

Quote:


> Originally Posted by *revro*
> 
> Hello,
> so to get 120hz you are using on 4 samsung S23A750 lcds, the mini display port to display port cable - thats just simple cabel so no expensive adapter needed
> and the fifth samsung lcd is connected via dvi cable and runs at 75Hz?
> also would trifire 7970 be enough to feed 5 1920x1080 @120Hz?
> thank you
> revro


No, Vega has 4 Samsung S23A750s all running off the 7970 Lightning's four mini-DisplayPorts, and his 5th monitor is a Samsung S23A700 running off the Lightning's single-link DVI port, which he overclocked with ToastyX's modified driver that allows higher refresh rates:
http://www2.120hz.net/showthread.php?270-Modified-AMD-ATI-driver-to-allow-higher-refresh-rates

I am using 4 Samsung S27A750s and 1 Samsung S27A950, because I already had three perfect S27A750s on hand and it was easier and cheaper to pick up two more 27" monitors than to find 5 perfect 23"s, since they don't manufacture these displays anymore. However, I did briefly try 5x1 with the 23"s, and I can't say one way or the other whether 27s or 23s are better; they both bring something different to the table. I like the higher pixel density of the 23"s, but the 27"s lower pixel density is less noticeable once they are rotated into portrait. I like the greater size of the 27"s, but the size benefit loses its punch in 5x1, as five 23s are already plenty big.
Quote:


> Originally Posted by *AikenDrum*
> 
> If I had a darkened room, I'd do it. I'd get a curved screen and try out nthusim. Alas, I do not have a darkened room.
> I think I'd pick a higher-res projector, though. I dunno about 1280x800. I guess if you blend three+ of 'em in portrait, you'd get a fair number of pixels, but... still... it's a well-known fact that MORE IS ALWAYS BETTER.


Quote:


> Originally Posted by *phill1978*
> 
> http://www.youtube.com/watch?v=dOY2lREuwjU
> do this next ^


I already tried the edge-blending multi-projector fandango two years ago. I did not like the middleman edge-blending software because it caused additional conflicts between the GPU drivers and the games' code. Long story short, it was a friggin' nightmare getting games to work properly. Maybe they have improved the edge-blending software and support since then, but I have no desire to go that route ever again. Here's my old setup - two edge-blended 1080p projectors on a 4'x12' custom curved screen that I designed and built myself:
http://www.youtube.com/watch?v=ApfvK9f9eMI

The only thing I regret with my 5x1 setup is the lack of 3D functionality. I really, really like 3D when it's done right, but 3D is not so great on these Samsungs because they have a tendency for annoying crosstalk in brighter, high-contrast scenes. So I am going to do three 720p DLP projectors in portrait on my second rig, which has a 3770K and a GTX 680. DLP projectors are awesome with motion - not as good as CRT monitors, but head and shoulders better than 120 Hz LCD displays. Also, DLP projectors have ZERO crosstalk... their only downfall is the lowly 720p resolution, which hopefully will be remedied by having three of 'em in portrait. I think another GTX 680 in SLI should be able to handle 2400x1280 for 3D gaming goodness.


----------



## Snipe3000

HerculeanScreens!?, I haven't heard from you in quite a while, so this is where you've been.


----------



## CallsignVega

Multi-projector setups are always a fail. Don't look into the proverbial light!









Just remember, even three 720p projectors will have significantly fewer pixels than a single 1440p monitor. Bad on the eyes, good on the GPUs, I guess.
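The pixel math behind that remark, for anyone counting (simple arithmetic, assuming 1280x720 projectors):

```python
# Three 720p projectors vs. one 1440p monitor, total active pixels.
projectors = 3 * 1280 * 720    # triple-720p projector wall
monitor_1440p = 2560 * 1440    # single 1440p panel

print(projectors, monitor_1440p, f"{projectors / monitor_1440p:.0%}")
# The projector wall has only 75% of the pixels of one 1440p screen.
```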


----------



## mlb426

Vega, you have a link to that last photo in your Samsung vs Asus review of the castle? Cool shot


----------



## revro

Well, it just hit me that for a 5x1 setup with the BenQ XL2420T at 120 Hz, I would not need any active dual-link DisplayPort adapters, as this monitor has DP 1.2 itself.

Ah, I just spent hours thinking about procuring 4 of these adapters at 100 EUR each; lol, I am so brainless.








I mean, I just need a normal miniDP to DP cable and I can game at 120 Hz...

As a wise man once said, it is one thing to know how to read, and another to understand.









best
revro


----------



## l88bastar

Quote:


> Originally Posted by *Snipe3000*
> 
> HerculeanScreens!?, I haven't heard from you in quite a while, so this is where you've been.


Ya, the start-up costs were too high for such a niche product, and driver/software conflicts were a total nightmare, so I abandoned it. Rise of Flight was the only game I really got to enjoy on the setup, and one game is not enough to justify such a setup... unless that game is BF3, lol.
Quote:


> Originally Posted by *CallsignVega*
> 
> Multi-projector setups are always fail. Don't look into the proverbial light!
> 
> 
> 
> 
> 
> 
> 
> 
> Just remember even three 720P projectors will have significantly less pixels than a single 1440P monitor. Bad on the eyes, good on the GPU's I guess.


Awesome review of the Asus 144, BTW! Ya, but the projectors are only for 3D gaming... catlickers can't do 3D, and the Samsungs pretty much suck at 3D... also, DLP projectors are really great with motion and color. I'm keeping the 5x1 setup as long as I don't throw it over the balcony from all of this dead-4th-GPU aggravation (post #124 for anybody interested in my personal hell):
http://www.xtremesystems.org/forums/showthread.php?281232-MSI-HD-7970-Lightning-water-block/page5

I'm gonna get one of those Overlord Tempest 120 Hz 1440p displays when they are available and see what all the hubbub is about... I'm also probably gonna get one of those 21:9 displays too, as I hear they actually come with a female Asian model who will soap and scrub your...... feet..... while you game.








http://www.engadget.com/2012/08/27/lg-ea93-ultrawidescreen-ea83-wqhd-monitor/


----------



## CallsignVega

Quote:


> Originally Posted by *mlb426*
> 
> Vega, you have a link to that last photo in your Samsung vs Asus review of the castle? Cool shot


Here you go: http://windows.microsoft.com/is-IS/windows/downloads/european-castles-theme
Quote:


> Originally Posted by *revro*
> 
> Well, it just hit me that for a 5x1 setup with the BenQ XL2420T at 120 Hz, I would not need any active dual-link DisplayPort adapters, as this monitor has DP 1.2 itself.
> Ah, I just spent hours thinking about procuring 4 of these adapters at 100 EUR each; lol, I am so brainless.
> 
> 
> 
> 
> 
> 
> 
> 
> I mean, I just need a normal miniDP to DP cable and I can game at 120 Hz...
> As a wise man once said, it is one thing to know how to read, and another to understand.
> 
> 
> 
> 
> 
> 
> 
> 
> best
> revro


Ya, that will work fine for four of them. You just have to figure out how to get the 5th to 120 Hz - modified drivers, etc.
Quote:


> Originally Posted by *l88bastar*
> 
> Ya the start up costs were too high for such a niche product and driver / software conflicts were a total nightmare, so I abandoned it. Rise of Flight was the only game I really got to enjoy on the setup and one game is not enough to justify such a setup.......unless that game is BF3 lol.
> Awesome review of the asus 144 BTW! Ya but the projectors are only for 3D gaming...catlickers can't do 3D and samsungs pretty much suck at 3D...also DLP projectors are really great with motion and color...Im keeping the 5x1 setup as long as I don't throw it over the balcony from all of this dead 4th gpu aggrivation (post #124 for anybody interested in my personal hell)
> http://www.xtremesystems.org/forums/showthread.php?281232-MSI-HD-7970-Lightning-water-block/page5
> I'm gonna get one of those overlord tempest 120hz 1440p displays when they are available and see what all of the hub bubb is about...im also probably gonna get one of those 21:9 displays too as I hear they actually come with the female asian model who will soap and scrub your......feet.....while you game
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.engadget.com/2012/08/27/lg-ea93-ultrawidescreen-ea83-wqhd-monitor/


I want my asian model to do a lot more than that.









I was thinking about getting some v2 NVIDIA glasses to try 72 Hz per eye on this Asus. I hear it's just about the best 3D monitor out there with its double 3D brightness, higher Hz and zero cross-talk.


----------



## Descadent

For as much money as you two spend, you should have an Asian model to do more than that lol


----------



## l88bastar

Hahaha throw in some bezel-less displays and you have Vega's heaven right here lol
http://discotreats.com/wp-content/uploads/gwen-stefani-harajuku-girls1.jpg

Ya I would be very interested in how the 144hz handles 3D! If it doesn't have any crosstalk in 3D like the samsungs do, I would be tempted to ditch the 3D projector idea.
Quote:


> Originally Posted by *Descadent*
> 
> for as much money as you two spend. you should have an asian model to do more than that lol


Flashbacks of being chased out of tawdry Southeast Asian rub n tug parlors by crazed stick-wielding shopkeepers beset me.

premium


----------



## GnarlyCharlie

Quote:


> Originally Posted by *l88bastar*
> 
> Flashbacks of being chased out of tawdry Southeast Asian rub n tug parlors by crazed stick weilding shopkeepers beseech me.
> premium


Yeah, me too.. And I just have to keep telling myself, "Hey, I've been thrown out of better joints than THIS!"


----------



## CallsignVega

Asus is sneaky. This is really only a 2D monitor, which is why they removed the IR transmitter and 3D glasses. 144 Hz is *2D* only. So no 72 Hz per eye 3D for anyone. I don't even know if the NVIDIA glasses could shutter that fast anyway.
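For anyone wondering where the 72 Hz number everyone keeps throwing around comes from: active-shutter 3D alternates left-eye and right-eye frames, so each eye only ever sees half the panel's refresh rate. A quick sketch of the arithmetic (just the division, nothing monitor-specific):

```python
# Active-shutter 3D shows left and right images on alternating refreshes,
# so the effective refresh rate per eye is half the panel refresh rate.

def per_eye_hz(panel_hz):
    """Effective per-eye refresh rate for frame-sequential shutter 3D."""
    return panel_hz / 2

print(per_eye_hz(120))  # 60.0 -> what the monitor actually does in 3D mode
print(per_eye_hz(144))  # 72.0 -> what 3D at 144 Hz would have given
```

Which is exactly why a 120 Hz cap in 3D mode kills the hoped-for 72 Hz per eye.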


----------



## l88bastar

You need the 3DVision 2 kit. It's got the IR transmitter and everything ya need. I got mine for $90 on ebay.


----------



## CallsignVega

No, the manual explicitly states that it will not work in 3D mode at 144 Hz. 120Hz only.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> No, the manual explicitly states that it will not work in 3D mode at 144 Hz. 120Hz only.


http://www.youtube.com/watch?v=WsJSRP7cZVo


----------



## Mhill2029

Quote:


> Originally Posted by *CallsignVega*
> 
> No, the manual explicitly states that it will not work in 3D mode at 144 Hz. 120Hz only.


What!!! So what's the point in it? I thought that was the whole reason for the existence of the Asus 144Hz model: 72Hz per eye 3D greatness. If it's restricted to 120Hz in 3D there are going to be a few disappointed reviewers and owners out there saying................. errrrrrrr what?

Way to go Asus.......not only do they rave about their Mars III and then pull it from release, they do this with a monitor.


----------



## British

Quote:


> Originally Posted by *Mhill2029*
> 
> What!!! So what's the point in it?


2D smoothness ?









I'd take it if it wasn't so big in size and so low in resolution.


----------



## CallsignVega

Ya, it's only for the extra 2D smoothness. Played some more BF3, it's pretty darn smooth.


----------



## Snipe3000

Thanks Vega & l88bastar, this turned out well


----------



## l88bastar

So many of this monitor's press releases mislabel it as being 72hz per eye in 3D mode, and they are totally wrong. Vega and the 3D Vision blog are like the only two sources that have clarified this...baah...whatever....I guess the real question is: how does she look naked without dem bezels?








Quote:


> Originally Posted by *Snipe3000*
> 
> Thanks Vega & l88bastar, this turned out well


Very, very nice! 3x1 landscape is the best for driving and flight sims








What racing games you playin on that? I wish they would do a remake of the classic game "Stunts" or a remake of the original Need for Speed from the 3DO. Those display panels are pretty cool once you remove them from the bezel, aren't they? They remind me of a large cookie tray.......mmmmmm cookies.
Quote:


> Originally Posted by *GnarlyCharlie*
> 
> Yeah, me too.. And I just have to keep telling myself, "Hey, I've been thrown out of better joints than THIS!"


AhhhhHahahahahhhahahaha.


----------



## Snipe3000

Quote:


> Originally Posted by *l88bastar*
> 
> Very, very nice! 3x1 landscape is the best for driving and flight sims
> 
> 
> 
> 
> 
> 
> 
> 
> What racing games you playin on that? I wish they would do a remake of the classic game "Stunts" or a remake of the original Need for Speed from the 3DO.


iRacing, rFactor 2, Dirt 3, pCars, RBR, F1 2011, simraceway


----------



## revro

beh. only racing game i really liked was Death Rally







though i absolutely can't imagine it on eyefinity oO
http://www.youtube.com/watch?v=qdGters14Ps


----------



## l88bastar

Ha..I cannot wait to play the new & modern Doom on my 5x1
http://www.youtube.com/watch?v=rRIy5ymD3uc


----------



## yakapo

Quote:


> Originally Posted by *Snipe3000*
> 
> Thanks Vega & l88bastar, this turned out well


This... is what I'd like to do eventually. However I think I'm going to get that 21:9 display as a low-cost entry into widescreen gaming. I just got a GTX 670 and I'm hoping it's enough to push that display. Obviously it wouldn't be enough for three displays. Hmm, maybe I should have got a 7970 instead.

Has anyone here tried console gaming with a tv that interpolates frames to make it 120hz? What if they made a monitor that interpolated frames? Could you get essentially the same result as 120fps on a genuine 120hz monitor? That way you don't need multiple cards to get extra smooth gameplay with higher resolutions.


----------



## Snipe3000

Quote:


> Originally Posted by *yakapo*
> 
> This... is what I'd like to do eventually. However I think I'm going to get that 27:9 display as a low cost entry into widescreen gaming. I just got a GTX 670 and I'm hoping its enough to push that display. Obviously it wouldn't be enough for 3 displays. Hmm, maybe I should have got a 7970 instead.
> Has anyone here tried console gaming with a tv that interpolates frames to make it 120hz? What if they made a monitor that interpolated frames? Could you get essentially the same result as 120fps on a genuine 120hz monitor? That way you don't need multiple cards to get extra smooth gameplay with higher resolutions.


Massive amounts of input lag
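To see why, here's a back-of-envelope sketch. To synthesize an in-between frame, the TV has to hold frame N until frame N+1 has arrived, so the minimum penalty is a full source-frame interval before any processing time is even counted. The buffer depth and processing figures below are illustrative guesses, not measurements of any particular set:

```python
# Why motion interpolation always adds input lag: the interpolator needs
# the *next* real frame before it can build the in-between frame, so it
# must buffer at least one full source-frame interval, plus whatever time
# the image processor itself takes.

def interpolation_lag_ms(source_fps, buffered_frames=1, processing_ms=10.0):
    """Minimum added display latency in milliseconds.

    buffered_frames and processing_ms are hypothetical values for
    illustration; real TVs often buffer several frames.
    """
    frame_interval_ms = 1000.0 / source_fps
    return buffered_frames * frame_interval_ms + processing_ms

# 60 fps console output, one buffered frame, ~10 ms of processing:
print(round(interpolation_lag_ms(60), 1))  # 26.7 ms, and that's the floor
```

With the multi-frame buffers real TVs use, you land in the 50-100+ ms range, which is why a true 120 Hz monitor driven at 120 fps feels so different from interpolated "120 Hz".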


----------



## CallsignVega

Quote:


> Originally Posted by *Snipe3000*
> 
> Thanks Vega & l88bastar, this turned out well


Fabulous driving setup! I bet that thing is awesome in racing sims. I usually end up just trying to run everyone off the road though.


----------



## l88bastar

Quote:


> Originally Posted by *CallsignVega*
> 
> Fabulous driving setup! I bet that thing is awesome in racing sim's. I usually end up just trying to run everyone off the road though.


Not me...I'm usually the disgruntled griefer that gets run off the road, then turns to the dark side and drives the wrong way, playing chicken against oncoming traffic


----------



## utnorris

Just saw your for sale thread Vega, are you getting rid of the dream machine?


----------



## feteru

Quote:


> Originally Posted by *utnorris*
> 
> Just saw your for sale thread Vega, are you getting rid of the dream machine?


Saw that too? What'cha doing? What was the problem with this setup?

EDIT: Whoops, Congrats on becoming a father (soon)!


----------



## l88bastar

Damn Vega it sucks that you gotta sell all your stuff, but congrats to the father-to-be with your little one on the way! Kids can be very expensive and I can't wait to see what kind of crib you can mod! Do you know if it's going to be a boy or girl yet??? When's the due date again? I know it was soon.


----------



## utnorris

Congrats to the soon to be father!


----------



## Snipe3000

Grats


----------



## CallsignVega

I wouldn't call him a "child" when he arrives as he gets angry:



GW2 is still borked, crashing with the 5x1 setup, and I am moving soon, so this version of the PC will be wrapped up. When I get set up in my new place, hopefully the 89xx series will be coming out and maybe a ~4K resolution screen.

In the meantime the 144 Hz Asus is rocking BF3/GW2 swimmingly on my second PC with a 1400 MHz 680 Classified. The more I use this monitor the more I like it.


----------



## b0z0

Congrats on the other hand


----------



## Citra

Congrats Vega!


----------



## zosothepage

Congrats bro


----------



## 161029

Congratulations!


----------



## revro

congratulation

hmm now i see a big problem with 5x1 portrait with benq xl2420t, as its edges are not suited to putting one monitor's bezel behind another's ...
http://content.hwigroup.net/images/products/xl/132184/4/benq_xl2420t.jpg
http://i50.tinypic.com/34624h1.jpg

best
revro


----------



## revro

double post, please delete ...


----------



## CallsignVega

Quote:


> Originally Posted by *revro*
> 
> congratulation
> hmm now i see a big problem with 5x1 portrait with benq xl2420t as its edges are not suited to put one monitor bezel behind other ...
> http://content.hwigroup.net/images/products/xl/132184/4/benq_xl2420t.jpg
> http://i50.tinypic.com/34624h1.jpg
> best
> revro


You would need to remove the bezels/case.


----------



## l88bastar

Hey Vega, I know all your stuff is for sale, but let us know if you're gonna be giving away any free stuff again! That was awesome last time and I still got that 2700k you sent me rocking my backup rig!


----------



## feteru

Quote:


> Originally Posted by *l88bastar*
> 
> Hey Vega, I know all your stuff is for sale but let us know if your gonna be giving away any free stuff again! That was awesome last time and I still got that 2700k you sent me rocking my backup rig!


Please do, I missed it last time due to camping, but that was incredible.


----------



## armartins

Here we go again... fill Vega's inbox with "give me that" messages lol. Congrats! May those Vertex turn into diapers! =)


----------

