
The MX way!

The big question. If you have already read some other MX reviews, you may have noticed that, overall, its performance won't disappoint you in any way, especially at its average price tag of $120+ USD.
As already mentioned earlier, the GeForce2 MX (NV11) chipset is based on the exact same core as its bigger brother, the GeForce2 GTS. Nvidia had to eliminate some of the GTS's main features in order to produce, at a budget price, a quality 3D accelerator that could compete with, and offer superior quality to, the competition.

The original GTS chipset features four rendering pipelines whereas the MX comes with a total of two, lowering its fill rate by roughly 50% in most demanding 3D applications. Because the MX chipset is based on the same core as the GTS, each of its two pipelines is also able to process two textures in a single clock.

As stated earlier, the MX chipset, just like the GTS, is built on the more advanced 0.18-micron process. This lowers its power consumption (around 4W) by 50% compared to the GeForce2 GTS, and to one-quarter of the power of the original GeForce core, which was built on the 0.22-micron process and consumed over 16W. At the same time, Nvidia decided to lower the GTS's original 200MHz core clock to 175MHz for the MX. Because of this, the MX chipset is able to run without a heatsink attached, saving production costs. You will find out a little later that Leadtek, like several other manufacturers producing MX-based cards, decided to include a heatsink anyway, for no other reason than to overclock the board and push this $150 product to its full potential.

The MX core itself, running at stock speed, is rated at 350 megapixels per second; that is over 50% lower than the GTS's 800 megapixels per second fill rate.
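Those figures follow directly from the pipeline counts and clock speeds quoted above: peak fill rate is simply pixel pipelines multiplied by core clock. A quick sanity check (the function name is mine, not anything from Nvidia's spec sheet):

```python
def fill_rate_mpixels(pipelines, core_mhz):
    """Theoretical peak fill rate in megapixels per second:
    pixel pipelines x core clock (MHz)."""
    return pipelines * core_mhz

mx  = fill_rate_mpixels(2, 175)   # GeForce2 MX: 2 pipes @ 175 MHz -> 350
gts = fill_rate_mpixels(4, 200)   # GeForce2 GTS: 4 pipes @ 200 MHz -> 800

deficit = 1 - mx / gts            # 0.5625, i.e. the MX trails by over 50%
```

Note that halving the pipelines alone would cut fill rate by exactly 50%; the extra deficit comes from the 175MHz versus 200MHz clock difference.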

The memory used on high-end video cards is one of the most important factors to watch for. A good example is to compare benchmark numbers of the original GeForce SDR with the DDR model featuring the same amount of memory: the results vary dramatically at high resolutions and high color depths. Imagine how a GTS card would perform with SDRAM memory? Very poorly indeed. Because the MX chipset is aimed at the low-cost market, do not expect anything special; DDR memory is likely out of the question. After speaking with Nvidia last month, DDR memory might be used on upcoming MX cards, but there will be a limit. As you might have already heard, the GeForce2 MX supports 64-bit SDR/DDR SDRAM and/or 128-bit SDRAM. Most boards available on the market today are 128-bit SDR solutions, but it seems we may see another variation of them later this year, namely boards using a 64-bit memory bus. According to Nvidia, the 64-bit versions should be based on DDR memory. In any case, I simply don't see a point in Nvidia even considering a 64-bit SDR bus, as its performance would be close to awful, and it would end up being very little improvement over the original GeForce 256 solution.

The stock memory speed of the 128-bit SDR version is officially rated at 166MHz, resulting in almost the same memory bandwidth as the original GeForce 256 SDR solution. The bandwidth itself is rated at around 2.7GB/s, compared to the 5.3GB/s offered by the GTS card featuring, of course, the expensive and fast DDR memory.
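Both bandwidth figures, and the reason a 64-bit SDR bus would be so painful, fall out of one formula: bus width in bytes times effective clock (doubled for DDR). A minimal sketch, with a helper name of my own choosing:

```python
def bandwidth_gb_s(bus_bits, clock_mhz, ddr=False):
    """Peak memory bandwidth in GB/s:
    (bus width in bytes) x (effective clock in Hz)."""
    effective_hz = clock_mhz * 1e6 * (2 if ddr else 1)
    return bus_bits / 8 * effective_hz / 1e9

mx_128_sdr = bandwidth_gb_s(128, 166)            # ~2.66 GB/s (the ~2.7 above)
gts_ddr    = bandwidth_gb_s(128, 166, ddr=True)  # ~5.31 GB/s (the 5.3 above)
mx_64_sdr  = bandwidth_gb_s(64, 166)             # ~1.33 GB/s, half of the MX SDR card
```

The hypothetical 64-bit SDR configuration lands at roughly 1.3GB/s, half the bandwidth of today's 128-bit MX boards, which is exactly why it looks so unappealing; 64-bit DDR would claw that back to the 128-bit SDR level.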




Copyright © 1999-2007 TargetPC.com. All rights reserved. Privacy information.