Geforce G210: Nvidia's first DirectX 10.1 card reviewed

    Nvidia's 40 nanometer generation has to show what it is capable of. PC Games Hardware tests whether switching to 40 nm, DirectX 10.1 and Co. was worth the effort.

    It took some time before Nvidia released its first graphics cards with DirectX 10.1 support, but with the transition to TSMC's 40 nanometer process the Geforce lineup is now, almost two years after AMD's Radeon family, compatible with the (currently) most recent API.

    For Nvidia, the upgrade from DirectX 10 to DirectX 10.1 still isn't really important. That is not very surprising, since only the new GPUs and some editions for OEM partners can make full use of the features offered by the API. The high-end desktop graphics cards of the Geforce GTX series still support only DirectX 10.0.

    Perhaps this is also the reason why Nvidia is unusually tight-fisted with product samples and even technical details. All information in this article is therefore based on what can be found on the Nvidia website, tests we ran ourselves, and general conclusions.

    Geforce G210: Technical Details
    We got our test sample, a Geforce G210, from an eBay auction. The card was originally part of an HP PC system and was produced by the OEM company Pegatron. The Geforce G210 is the smallest version of the new series, which is based on the GT218 chip. In the table below we compare its technical details to those of the Geforce 9400 GT as well as the Radeon HD 4350 and HD 3470.

    Graphics chip              | Geforce G210 (Pegatron) | Geforce 9400 GT | Radeon HD 4350 | Radeon HD 3470
    Chip / Process             | GT218 / 40 nm           | G96 / 55 nm     | RV710 / 55 nm  | RV635 / 55 nm
    DirectX version            | 10.1 (SM 4.1)           | 10.0 (SM 4.0)   | 10.1 (SM 4.1)  | 10.1 (SM 4.1)
    Core frequency (MHz)       | 589                     | 550             | 600            | 800
    Shader ALU frequency (MHz) | 1,402                   | 1,350           | 600            | 800
    Memory frequency (MHz)     | 499                     | 400             | 500            | 950
    Memory size (MiByte)       | 512                     | 512             | 256 - 512      | 256 - 1,024
    Memory interface (Bit)     | 64                      | 128             | 64             | 64
    Type of memory             | DDR2                    | DDR / DDR2      | DDR / DDR2     | GDDR3
    MAD (GFLOPS)               | 44.86                   | 43.2            | 96             | 64
    Texture fillrate (MTex/s)  | 4,712                   | 4,400           | 4,800          | 3,200
    Memory bandwidth (GByte/s) | 7.98                    | 12.8            | 8.0            | 15.2
    Multi-GPU                  | unknown                 | SLI             | Crossfire X    | Crossfire X
    PCI-E power (6-pin/8-pin)  | 0x / 0x                 | 0x / 0x         | 0x / 0x        | 0x / 0x
    Length PCB (mm, ca.)       | 168                     | 168             | 168            | 168
    Slots blocked              | 1                       | 1               | 1              | 1
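    The peak numbers in the table follow from simple arithmetic. The short Python sketch below reproduces the G210 column; note that the unit counts (16 shader ALUs, 8 texture units) are inferred from the table values, since Nvidia did not confirm them:

    ```python
    # Sketch: deriving the theoretical peak numbers from the clocks in the
    # table. Unit counts for the GT218 (16 shader ALUs, 8 TMUs) are our
    # assumption, back-calculated from the listed GFLOPS and fillrate.

    def mad_gflops(alus, shader_mhz):
        # A MAD (multiply-add) counts as 2 floating-point operations per clock.
        return alus * shader_mhz * 2 / 1000

    def fillrate_mtex(tmus, core_mhz):
        # One texel per TMU per core clock.
        return tmus * core_mhz

    def bandwidth_gbytes(mem_mhz, bus_bits, data_rate=2):
        # DDR2 transfers data twice per clock (data_rate=2); bus width in bits.
        return mem_mhz * data_rate * bus_bits / 8 / 1000

    print(mad_gflops(16, 1402))       # ~44.9 GFLOPS
    print(fillrate_mtex(8, 589))      # 4712 MTex/s
    print(bandwidth_gbytes(499, 64))  # ~7.98 GByte/s
    ```

    The same formulas explain why the HD 4350 leads in raw shader throughput despite its far lower 600 MHz ALU clock: its RV710 simply carries many more ALUs.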

    [Picture: GT218 chip next to a ruler. Source: PC Games Hardware] Of course we also took a look at the "naked" card without the cooler: at 57.3 square millimeters, the graphics chip is the smallest ever measured in the PC Games Hardware test lab - unfortunately there is no information about the number of transistors. That is about 20 square millimeters less than the RV710 (55 nm) of the HD 4350. This only leaves the question of whether Nvidia was able to squeeze enough power into this little chip.

    Geforce G210: Cooling, Loudness and Power Consumption
    The card itself is quite small - the board is only 168 millimeters long, and the height would also suit a low-profile card if a matching slot cover were available (which is of course possible for retail cards). Our Geforce G210, however, is a full-height version, since it offers Dual Link DVI, D-Sub VGA and HDMI outputs.

    An active single-slot cooler with a 45 millimeter fan is responsible for temperature regulation. The warm air is blown in the direction of the (closed) slot cover and thus passes over the G210's capacitor array - which probably doesn't do the components' lifetime any good.

    Below you can see a chart that lists our results for temperature, loudness and power consumption tests. For comparison we added the results of a passively cooled Radeon HD 4350 from MSI.

    Results                             | Geforce G210 (Pegatron OEM) | Radeon HD 4350 (MSI)
    Loudness Idle (Sone / dB(A))        | 0.5 / 25.1                  | 0 (passive)
    Temperature Idle                    | 42 degrees Celsius          | n.a.
    Power consumption Idle              | 8.7 watts                   | 7.2 watts
    Loudness Race Driver Grid (Sone / dB(A)) | 1.3 / 32.1            | 0 (passive)
    Temperature Race Driver Grid        | 61 degrees Celsius          | n.a.
    Power consumption Race Driver Grid  | 19.5 watts                  | 16.3 watts
    Loudness Furmark (Sone / dB(A))     | 2.8 / 42.5                  | 0 (passive)
    Temperature Furmark                 | 65 degrees Celsius          | n.a.
    Power consumption Furmark           | 21.4 watts                  | 17.6 watts

    During our tests we noticed an interesting behavior of the idle mode: besides the full 3D frequencies, GPU-Z reveals two additional levels: 405/405 MHz for core and VRAM as well as 135/135 MHz for the sleep mode. Apparently, however, the RAMDAC frequency is linked to the clock speed of the chip core: as soon as Dual Link DVI speed (two transmitters at 165 MHz each) is applied - as required for resolutions higher than 1,920 x 1,200 pixels - the frequencies are not lowered below 405/405 MHz and the 135 MHz level is skipped. We noticed the same phenomenon with a 30 inch (Dual Link DVI) display as soon as the resolution was higher than 1,280 x 800 pixels. With a 24 inch display at 1,920 x 1,200, on the other hand, the lowest frequency level was reached. All in all, this means:
    • 30 inch LCD with Dual Link DVI connection and resolutions above 1,280 x 800: 405/405 MHz
    • 24 inch LCD with Single Link DVI connection and resolutions up to 1,920 x 1,200: 135/135 MHz
    • CRT monitor with analog connection and resolutions up to 1,920 x 1,200 @ 60 Hz: 135/135 MHz; at higher resolutions or refresh rates: 405/405 MHz.
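    The behavior above can be summed up as a single rule - the deepest idle level is blocked whenever the Dual Link transmitter is active. This minimal sketch is our own interpretation of the GPU-Z readings, not documented Nvidia behavior:

    ```python
    # Sketch of the observed idle-clock rule (our interpretation of the
    # GPU-Z readings; function and parameter names are our own).

    def idle_clocks_mhz(dual_link_active):
        """Return the (core, VRAM) idle clock pair in MHz."""
        # An active Dual Link DVI connection keeps the RAMDAC/pixel clock
        # high, which in turn blocks the deepest 135 MHz idle level.
        return (405, 405) if dual_link_active else (135, 135)

    print(idle_clocks_mhz(True))   # (405, 405) - e.g. 30" LCD above 1,280 x 800
    print(idle_clocks_mhz(False))  # (135, 135) - e.g. 24" LCD at 1,920 x 1,200
    ```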

    However, this has only a small effect on the idle power consumption: compared to the value listed in the chart above, the power consumption at 405/405 MHz is only 0.5 watt higher and thus lies almost within measuring tolerance.

    Much more annoying was a bug in our test sample's fan control: as soon as the graphics card was stressed and the fine-tuned control raised the fan speed above the 24 percent used in idle mode, the speed was not reduced again until the whole system was rebooted.

    Geforce G210: Benchmarks
    Given that the card is aimed at the absolute entry-level market, we decided to adjust our game benchmarks a little. We use older games without any modifications and run them at a width of 1,280 pixels without anti-aliasing or anisotropic filtering. In order to ensure comparability with our normal benchmarks we still use maximum graphics settings and our usual test system, even if this setup might seem inappropriately fast for the G210 and the HD 4350 we used for comparison. Before these graphics cards would be held back by the processor, we would probably have to switch to one of the smallest dual-core CPUs currently available.

    The HD 4350 beats the Geforce G210 in all tests. The higher the shader performance requirements, the bigger the gap between the two competitors - even though the 20 fps mark is exceeded in only two cases.

    Geforce G210: DirectX 10.1 benefit?
    Of course we also wanted to know whether Nvidia's cards benefit from DirectX 10.1 in the same way AMD's Radeon models do. By now there are several games that utilize DirectX 10.1, among them Battleforge and Tom Clancy's HAWX. Only in Battleforge were we able to measure a DirectX 10.1 benefit - a typical problem at this performance level: we have already shown that slower graphics cards gain less from DirectX 10.1 than current high-end cards.

    Although HAWX activated the DirectX 10.1 mode, there was no measurable performance benefit. We therefore decided to run the integrated Battleforge benchmark at low details (shaders: medium), but with 2x MSAA and SSAO activated - both features benefit from DirectX 10.1.

    Here the Radeon also beats the Geforce G210. But both cards benefit from DirectX 10.1: the Radeon runs about 7 percent faster and the Geforce even about 10 percent. So Nvidia's "Mission DirectX 10.1" was successful, but given the actual frame rates in the games this is only a theoretical benefit.

    Geforce G210 (Pegatron): Conclusion
    The graphics chip, which isn't intended for gaming, can't convince in every area. Although superior gaming performance wasn't to be expected given the technical specifications, the HD 4350 was faster in the benchmarks. Either way, both cards are only suitable for older games or for accelerating the modern user interfaces of current operating systems. Video fans might also be interested in the cards' HD acceleration. The PhysX advantage many Geforces have over the Radeons doesn't come into effect here: with only 16 shader units the chip is too slow for GPU PhysX, so the driver option is simply missing.

    The technology transition to DirectX 10.1, on the other hand, convinces: like the Radeon, the Geforce benefits from Microsoft's API. However, the implementation of the 40 nanometer technology could have been better - at least as far as our test sample is concerned: active cooling that gets annoyingly loud in some cases should not be necessary for a 22 watt chip. We also couldn't measure a noticeable advantage in power consumption compared to the 55 nanometer Radeon cards.

    Thus the G210 is, all in all, a card that might primarily make Nvidia happy: with a chip size of less than 60 square millimeters, low prices and high volumes are possible - at least until the OEMs start demanding DirectX 11 cards. If you want to upgrade your system or need a card for video playback and don't care about gaming performance at all, you might also consider the Geforce G210 - but you should look for a passively cooled version.
