µATX Part 2: Intel G33 Performance Review
by Gary Key on September 27, 2007 3:00 AM EST - Posted in Motherboards
MSI G33M: Board Layout and Features
MSI includes most of the same features as Gigabyte and ASUS, but places some of them differently. The color scheme is also different, with MSI's typical red PCB and a few colorful additions. MSI does save a bit of money by using standard electrolytic capacitors in some areas rather than 100% "solid" conductive polymer aluminum capacitors. We don't have a problem with this approach, as none of the three boards we're looking at today really targets the high-end market.
MSI's rear panel layout is definitely unique, with four of the USB ports on a block that sits above the VGA and Firewire ports. MSI also offers two eSATA ports, at the cost of two internal SATA connectors. A Marvell chip powers the IDE port as well as an additional SATA port. The remainder of the rear panel consists of the typical PS/2, Firewire, network, and audio connections.
The location and layout of the connectors is generally fine, but the ATX12V connector sits below the CPU socket, and in most cases it will require looping the cable over the CPU heatsink, which is less than ideal. You can also see the blank area on the bottom of the board where two additional SATA ports could have been installed. A single eSATA port would have been sufficient, we think, but perhaps there are people out there who make more use of eSATA than we do.
One final concern worth pointing out on the MSI design is that the DIMM slots are not properly color coordinated, at least not relative to the rest of the industry. We generally feel that users like to install DIMMs into the same colored slots for dual channel operation, but MSI chooses to color channel A orange and channel B green. This makes it a lot more difficult to explain which slots should be used for dual channel operation, as you actually have to use the first orange and the first green slot (or the second slots on both channels) to get dual channel memory support. It seems like it would be a lot easier to simply say "install memory into the same colored slots" -- and considering the rest of the industry has taken that approach, we feel MSI should join the club.
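To make the pairing concrete, here is a minimal sketch of the slot-to-channel mapping described above; the slot labels and physical ordering are assumptions for illustration, not MSI documentation. The point is simply that a matched pair must span one orange and one green slot rather than two slots of the same color.

# Hypothetical mapping of the MSI G33M DIMM slots to memory channels,
# based on the color scheme described above (channel A orange, channel B green).
SLOT_CHANNELS = {
    "DIMM1 (orange)": "A",
    "DIMM2 (green)": "B",
    "DIMM3 (orange)": "A",
    "DIMM4 (green)": "B",
}

def is_dual_channel(populated_slots):
    # A two-DIMM install runs in dual channel only if it touches both channels.
    return {SLOT_CHANNELS[s] for s in populated_slots} == {"A", "B"}

print(is_dual_channel(["DIMM1 (orange)", "DIMM3 (orange)"]))  # False: same color, same channel
print(is_dual_channel(["DIMM1 (orange)", "DIMM2 (green)"]))   # True: one orange + one green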
MSI chose to include a PCI-E x1 slot in place of the x4 slots on the Gigabyte and ASUS boards, and the slot is located next to the main x16 connector. That means users who install a dual-slot GPU will not have access to any other PCI-E slots. Whether or not that's a serious problem is up to the individual to decide, but PCI slots are seeing less and less use as time goes on, so long term this may not be the best design decision. Clearance between the PEG slot and the memory retention clips continues to be very tight, and a smaller graphics card might be advisable on all three of these motherboards. Anyone looking at an HTPC setup would probably use one of the fanless GeForce 8500/8600 or Radeon HD 2400/2600 cards, but gamers might find it difficult to install a GeForce 8800 or Radeon HD 2900.
Both chipset components (northbridge and southbridge) are passively cooled, just like on the other two boards. That's a very good design decision for anyone looking at creating a silent HTPC, but then we return to the video output options and end up scratching our heads. If it isn't clear yet, we really aren't all that impressed with the G33 chipset. MSI also includes two fan headers, both 4-pin: one for the CPU and one for the chassis. The chassis connector is located at the very bottom of the board, just below the last PCI slot. Fan speeds for both headers can be individually controlled within the BIOS or via software.
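For the curious, the "smart" fan control offered on 4-pin headers like these generally boils down to a simple temperature-to-duty-cycle ramp, as in the sketch below; the temperature thresholds and duty-cycle values are illustrative assumptions rather than MSI's actual BIOS defaults.

# Illustrative sketch of a BIOS-style PWM fan curve: ramp the duty cycle
# linearly between an idle setpoint and a full-load setpoint.
def fan_duty_percent(cpu_temp_c, min_temp=40, max_temp=70, min_duty=30, max_duty=100):
    if cpu_temp_c <= min_temp:
        return min_duty            # quiet floor for idle/HTPC use
    if cpu_temp_c >= max_temp:
        return max_duty            # full speed under sustained load
    span = (cpu_temp_c - min_temp) / (max_temp - min_temp)
    return round(min_duty + span * (max_duty - min_duty))

for temp in (35, 45, 55, 65, 75):
    print(temp, "C ->", fan_duty_percent(temp), "% duty")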
Naturally, MSI also includes a set of software utilities. Other than the MSI Live Monitor, these utilities of course have to have unique appearances with funky shaped windows and odd button locations. Call us old-fashioned, but we really don't have a problem with rectangular application windows that follow the standard Windows look and feel.
26 Comments
tooter2 - Sunday, September 30, 2007 - link
Hi all. I had just ordered the DS2R board when I read your review, and how poor this board overclocked, exceeding fsb of 400, contrary to what I had read elsewhere. I was a bit concerned to say the least. Well, I just spent an hour running the newest memtest86 using this board with an E6750 at 7 x 500 = 3.50 GHz at default vcore, using 2 x 1GB of G.Skill DDR2 6400 at 5-5-5-15 with vdimm at +0.2, and all other settings at default except for the power management settings, so as to be sure that I was running at the high speeds. This was with the Intel stock cooler. I've also run memtest at 8 x 463 = 3.70 GHz at default vcore. CPU temp never exceeded 38C. And I've used an older Antec NeoPower 480 for my PSU. I should add that this is with onboard video in a bare-bones setup, i.e., no case, no hard drives, just an IDE optical drive. This board appears to be an overclocking monster, not at all like your results. And I plan to use a video card in this board, but I bought it for its mATX size plus the fact that I can get a video card later. I want to see how the new AMD cards pan out, plus what NVIDIA comes back with. This will be used in an HTPC setup, but a setup where I can play games as well. Hence, the E6750.
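(For readers following the numbers: a Core 2's core clock is simply FSB x multiplier, so the settings quoted above can be sanity-checked with a quick sketch like the one below; the stock 8 x 333 line is included only for reference and is not from the comment.)

# Quick check of the core-clock arithmetic quoted in the comment above.
def core_clock_ghz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier / 1000

print(core_clock_ghz(500, 7))  # 3.5 GHz  (the 7 x 500 setting)
print(core_clock_ghz(463, 8))  # ~3.7 GHz (the 8 x 463 setting)
print(core_clock_ghz(333, 8))  # ~2.66 GHz (the E6750's stock 8 x 333, for reference)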
tooter2 - Sunday, September 30, 2007 - link
In the above post, I meant to say "not exceeding 400 fsb".
jonp - Saturday, September 29, 2007 - link
The Asus P5K-VM feature set chart shows only 1 PCI slot when it should say 2.
overzealot - Saturday, September 29, 2007 - link
The G33M-DS2R comes with an eSATA expansion slot bracket. The product page and the box it comes in also make it quite clear that eSATA is supported.
falacy - Friday, September 28, 2007 - link
Something I have noticed over the years is that this site doesn't really take an objective look at the "low end" hardware from the perspective of those of us who would purposely purchase these items - even though we're "tech savvy". For instance, though I do agree that the absence of a DVI port isn't great, I find it hard to believe that I'm the only person who is still happily using a 17" CRT monitor at 1024x768, and it's pretty insulting to hear that anything without a DVI port isn't worth looking at. Did everyone forget that CRT monitors have better visual quality than LCDs - unless you're able to shell out far, far more money? I digress... Here is my path to the P5K-VM:
When I moved in 2003, after losing my great job, I had to sell my good computer, and when I finally got settled I was far too poor to replace it. That was 2004. Anyhow, I needed a computer, so I bought a Dell desktop (P4 2.8) and used it until 2005 with an ATI 9550SE graphics card. This was good enough to play Star Wars Galaxies, EverQuest, and a whack of other games that I played at the time. Of course, it ran everything office-like too. Later on I was given an ATI 9800XT video card, which was very expensive when it first came out. Anyhow, in 2006 I upgraded to an ASRock board that could handle a Core 2, yet still had an AGP slot so I could make use of the 9800XT. At the time, there weren't any cheap Core 2 processors, so I bought a P4 531 and it was a decent upgrade from the Dell. All this was awesome (and for the games I played I was happy), until recently when I bought a Pentium Dual-Core 2160 and then was lucky enough to have the fan on my 9800XT fail, which awesomely fried the GPU. Yay. I was back to using the i865G graphics, as I had given away my 9550SE and the only other cards in my collection of junk weren't any better than the onboard video.
And this brings me to yesterday, when I set up my new system.
I bought 1GB of RAM and a P5K-VM, and after testing it out I found that the graphics capabilities trounce the i865G onboard video, both in the practical test of playing World of Warcraft and in 3DMark2000 scores; the G33 is smooth and playable in WoW at 1024x768, whereas the i865G was somewhat choppy at 800x600. Also, apart from playing at 1x AA rather than 4x AA, the G33 scored the same in 3DMark2000 as the old ATI 9550SE that I used to have. And finally, it really isn't that much of a downgrade from the ATI 9800XT (which sucked up so much power even at idle that the air from my PSU went from hot to cool when I stopped using it!) in World of Warcraft (the only game I really play now). Sure, AA is nice, but I like the electricity/heat/noise savings better. Down the road, I may purchase a fanless PCI-E graphics card if the NEED arises.
Altogether, I believe a lot more credit should be given to the value of these motherboards. In fact, I have felt ever since the first onboard video chipsets to offer full AGP support that, so long as you're not giving up any important features, it's pretty stupid for the average person to buy a motherboard without onboard video - you never know when you're going to need it! There is a huge list of fun gaming titles that the onboard graphics can play with PlayStation 2 quality (or better) graphics, and I think that this information is lost on the AnandTech crowd. Also, these systems can run Windows XP with 1GB of RAM and be completely amazing in comparison to what was available just two short years ago!
The P5K-VM is a perfect motherboard for a person like me, who has some disposable income to build a computer over time and enough patience to make that happen. Eventually, I can add 8GB of RAM, or a wicked graphics card (if ever I feel like playing more than WoW or Neverwinter Nights), or 4 SATA drives to run software RAID5 (or 4 IDE drives to use with my Promise controller), or a camcorder to use the 1394 port, or a super-mega quad-core, low power consumption CPU.
Seems to me that anyone with a 17" CRT monitor, which often has better visual quality than the crappy LCDs people peddle these days, would be very wise to buy one of these boards now and upgrade as "the itch" and their budget allow!
lopri - Saturday, September 29, 2007 - link
Dunno whether I should laugh or cry over your post. Are you being serious or sarcastic? Sorry, it was a long day and I'm not that sharp a person.
falacy - Sunday, September 30, 2007 - link
That's exactly what I am talking about: the inability of so many people on AnandTech to see from the "Average Person"'s perspective. Funny enough, it just happens that the "Average Person" makes up the majority of computer purchasers in North America. As a person who has managed a "ma & pa" computer store in a small town, I can tell you that even the most inept of "boony noobs" out there has some computer knowledge these days. And many people still have some decent hardware kicking around that, considering the things they actually USE a computer for, they can squeeze a bit more value out of. Heck, it was just two years ago that we replaced an AMSTRAD 200 portable computer with a laptop that ended up frustrating the hell out of the customer, because it didn't do all the things her ancient computer did (such as print to her equally ancient printer). In fact, my computer is housed in a modified 486 AT server tower that we took in on trade, which was being used as an office server until we replaced it in 2005. For "Average People" doing average things with average expectations, it's amazing how long a computer can last (personally, I used my Celeron 300A Malay @450MHz and Abit BH6 Rev2 for over 3 years). Look at it this way: if all you're doing is crunching numbers and typing, my 486SX 25MHz laptop with WordPerfect 5.1 and Lotus 1-2-3 will still get the job done (and it will boot faster than anything else out there today).
Anyhow, it may come as a surprise that not everyone has enough money to just buy whatever the heck they want, whenever they feel like it. No, most of us have to set priorities in life - I believe that has something to do with being an adult and/or a parent. Consequently, "Average People" like me (in wealth, as opposed to computer knowledge) have to weigh the pros and cons a little more carefully, and someone like myself would rather throw some spare money at more storage space for my movies, or a camcorder, or a better TV to watch said movies on, than at an uber graphics card.
The plain truth of the matter is that the G33 under Windows XP can play fun games like Quake 3, Neverwinter Nights, World of Warcraft, and many other great 1999-2004 titles. All the while, it can do so using that CRT you probably already own, which likely still looks as good as new and will give you a sharper image at 1024x768 than the low-end LCD you'd likely buy. Finally, a board like the P5K-VM is amazing, because should a person strike it rich they could upgrade the hell out of their computer without ever needing to consider buying a new motherboard - DDR3, 45nm CPU, Gigabit LAN, 8-channel audio? Boy, that seems pretty "bleeding edge" from my vantage point on ye ol' interweb!
There's a lot of potential (for the "Average Person") in these boards.
strikeback03 - Tuesday, October 2, 2007 - link
I don't think a CRT can ever give a "sharper" image than an LCD - kinda the nature of the beast with discrete pixels vs. a scanning electron beam. Now, your CRT probably has better colors and almost certainly has better viewing angles than the average cheap LCD, but it is almost certainly not sharper.
Also, 1024x768 is REALLY small. Even using the internet is cramped, and forget the average programming environment or photo editing program.
finally, it does not appear the P5K-VM supports DDR3. The chipset can, but most motherboard makers are choosing either DDR2 or DDR3, as the slots are different, and they cannot be used simultaneously.
ltcommanderdata - Friday, September 28, 2007 - link
Can we please get a review of the 14.31.1 XP driver for the GMA X3000 that enables hardware DX9.0c SM3.0 acceleration? I know you've switched over to Vista, but the 15.6 driver release notes don't mention that they added hardware acceleration, so it looks like only the 14.31 and the newer 14.31.1 XP drivers have it. I would love to see a comparison between the GMA X3000, Xpress X1250, GeForce 7150, and a discrete X1300HM and 8500GT. You're probably saving the new drivers for an IGP review when the G35 GMA X3500 comes out (October 21?), but it would be nice to have numbers for the GMA X3000 too for comparison.
IntelUser2000 - Tuesday, October 2, 2007 - link
I agree, they should run XP driver tests. Better yet, they should test the G965 to get a taste of the G35.
Here are my results:
From Gary: "We set our quality settings to medium or low where applicable except for the first two are set to high and the sliders are set in the middle spot."
With that in mind, I did a test. However, I wasn't sure whether Object Scarring and Post Processing were on or off, so I did both tests.
AT settings + Object Scarring/Post Processing Off: 12.5
AT settings + Object Scarring/Post Processing On: 11.6
I also use dual-channel DDR2-800 with 5-5-5-15 timings and an E6600. I found that in Company of Heroes, performance increased by 10% going from 5-6-6-18 to 5-5-5-15.
Supreme Commander: 8.381
I am using the 14.31.1 driver and XP SP2.