Original Link: https://www.anandtech.com/show/721

ATI Radeon VE 32MB

by Matthew Witheiler on February 19, 2001 3:09 AM EST


The evolution of the high performance budget video card is occurring at breakneck speed. It was only eight months ago that the world was introduced to arguably the first performance-oriented budget video card, NVIDIA's GeForce2 MX. Since the GeForce2 MX's release, other video card companies have been following NVIDIA's lead, introducing versions of their own products that provide the maximum bang for the buck. The first to follow the GeForce2 MX's success was ATI's Radeon SDR 32MB card, arriving three months after the GeForce2 MX. Providing slightly better performance than the comparably priced GeForce2 MX, ATI would have had a winning card on its hands were it not for the card's driver problems, comparatively high price, and lack of dual monitor support, a feature offered on the GeForce2 MX.

The next product to make its way into this ever expanding market was 3dfx's Voodoo4 4500 video card. Lacking severely in the performance department, the Voodoo4 4500 fell to the bottom of our performance graphs in nearly every test, and it carried a relatively high price of $150 upon its release. We concluded that if the Voodoo4 4500 had been launched much earlier, or at a lower price, it might have met with success, but there was no way it was going to reach that level of success given the competition.

Although time has brought a significant price drop for the Radeon SDR card, ATI felt there was still one feature left to attack. The Radeon SDR is now often priced at the same level as quite a variety of GeForce2 MX based cards, at $99 after a mail-in rebate; however, ATI had yet to add dual monitor support to any Radeon based product. That all changes today with the release of the $99 Radeon VE, a card that ATI is targeting at the workstation by day, game computer by night market.



The Chip

Unlike previous Radeon incarnations, the Radeon VE core shares only some features with other Radeon cores. Let's see what ATI has chosen to drop in order to bring dual display capability to a Radeon powered card.

The core remains based on a 0.18 micron manufacturing process, meaning that ATI is able to ship the card at the same 183MHz core clock as other retail Radeon products. The core also continues to include ATI's memory bandwidth saving HyperZ technology. We have already discussed HyperZ in depth, but in summary the technology decreases the amount of data that must travel over the already crowded memory bus. HyperZ gave the Radeon SDR a big push, as that card's memory bus was half the width of the already memory bandwidth limited Radeon DDR's. The technology will play an even larger role in the case of the Radeon VE, but more on that in a minute.

The chip continues to include support for three bump mapping techniques (emboss, dot product 3, and environment mapped bump mapping) as well as support for the inverse discrete cosine transform, or iDCT for short. iDCT has been a specialty of ATI's for quite some time now and has gained the company quite a following in the home video crowd. As we showed in our DVD Roundup, hardware iDCT is able to significantly reduce the amount of work that a CPU has to do during DVD playback.
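For reference, the transform that the Radeon's iDCT hardware offloads is the standard 8x8 inverse DCT used in MPEG-2 decoding (this formula comes from the MPEG-2/JPEG specifications, not from ATI's documentation):

$$
f(x,y) = \frac{1}{4} \sum_{u=0}^{7} \sum_{v=0}^{7} C(u)\,C(v)\,F(u,v)\,
\cos\!\left[\frac{(2x+1)u\pi}{16}\right]\cos\!\left[\frac{(2y+1)v\pi}{16}\right],
\qquad C(k)=\begin{cases}1/\sqrt{2} & k=0\\ 1 & k>0\end{cases}
$$

Here $F(u,v)$ holds the 64 frequency coefficients decoded from the bitstream and $f(x,y)$ is the reconstructed 8x8 pixel block. This runs for every block of every frame, which is exactly the work the CPU gets to skip.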

Aside from the aforementioned features, the Radeon VE shares little in common with the higher performing Radeon chips, securing its spot as a true budget card. First, and perhaps most importantly, is the deletion of one of the Radeon's two rendering pipelines. You may recall that the original Radeon chips possess two rendering pipelines, each capable of processing 3 texels per clock, for a total of 6 texels per clock. With its single rendering pipeline, the Radeon VE's total output drops to 3 texels per clock. In actuality, however, most games will only be able to take advantage of 2 of the Radeon VE's 3 texture units. This is due to the fact that the vast majority of games out there use only dual textures, leaving the third texture unit unused. Upon the release of the Radeon DDR, we mentioned that this third unit would be nice in the long run as games switch to using three textures per pixel, but this transition seems to be taking longer than anticipated. In contrast, the GeForce2 MX can render 4 texels per clock in a dual textured game and 3 texels per clock in a triple textured game.
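A quick back-of-the-envelope sketch of what those pipeline counts mean for theoretical fill rate (the per-game "effective" figure at the end is our own arithmetic for illustration, not a vendor spec):

```python
def peak_fill_mtexels(pipelines, texel_units_per_pipe, core_mhz):
    """Theoretical texel fill rate in millions of texels per second."""
    return pipelines * texel_units_per_pipe * core_mhz

# Paper specs: every texture unit busy every clock.
print(peak_fill_mtexels(1, 3, 183))  # Radeon VE:   549 Mtexels/s
print(peak_fill_mtexels(2, 3, 183))  # Radeon SDR: 1098 Mtexels/s
print(peak_fill_mtexels(2, 2, 175))  # GeForce2 MX: 700 Mtexels/s

# In a typical dual-textured game the Radeon's third unit sits idle,
# so the VE behaves more like a 2-texel-per-clock part.
print(peak_fill_mtexels(1, 2, 183))  # Radeon VE, dual texturing: 366 Mtexels/s
```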

The second thing that the Radeon VE loses compared to the rest of the Radeon family is the 128-bit wide memory bus. In the Radeon VE, the memory bus is narrowed to only 64 bits wide and is used with DDR memory. This gives the Radeon VE effectively the same memory bandwidth as the Radeon SDR card with its 128-bit wide SDR memory. It is, however, half the memory bandwidth offered on the Radeon DDR card, with its 128-bit wide DDR memory bus.
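The arithmetic behind that equivalence is straightforward; a minimal sketch:

```python
def bandwidth_gb_s(bus_bits, mem_mhz, ddr=False):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    transfers_per_sec = mem_mhz * 1e6 * (2 if ddr else 1)  # DDR moves data twice per clock
    return (bus_bits / 8) * transfers_per_sec / 1e9

print(bandwidth_gb_s(64, 183, ddr=True))   # Radeon VE:  ~2.9 GB/s
print(bandwidth_gb_s(128, 183))            # Radeon SDR: ~2.9 GB/s
print(bandwidth_gb_s(128, 183, ddr=True))  # Radeon DDR: ~5.9 GB/s
```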

Thirdly, the RAMDAC on the Radeon VE is actually crippled compared to the other Radeon cards. Many praise the Radeon DDR, SDR, and All-in-Wonder cards for their superior 2D image quality, most likely because those cards use ATI's updated 360MHz RAMDAC to drive 2D output. The Radeon VE, on the other hand, uses a downgraded 300MHz RAMDAC. In our subjective tests, the Radeon VE possessed about the same image sharpness as the GeForce2 series cards, which many say is less crisp than the 2D output of the aforementioned Radeon cards.

Finally, the Radeon VE's core lacks a technology that ATI made quite a scene about upon the release of the Radeon DDR: the Charisma engine. The Charisma engine is ATI's T&L engine, which allows transforming, lighting, and clipping calculations to be performed on chip, reducing CPU utilization. It is a shame that the budget oriented Radeon VE does not possess these features, as this card is aimed at slower CPUs that will take any break they can get. Then again, in actual game play the single 3 texel pipeline will hold the Radeon VE back long before T&L could begin to make a difference.

The most obvious reason for cutting the Charisma engine out of the chip is the same reason that one of the rendering pipelines was cut: to reduce die size. How much space was saved by cutting these components is unknown, as ATI would not give us a number. It can be estimated, however, that significant space was saved, as there would have been no reason to make the cuts otherwise.

By looking at the Radeon VE's features, you may notice that it shares much in common with another recently announced ATI product: the Radeon Mobility. Although we were only able to comment on some of what the Radeon Mobility is about in our preview last week, the Radeon VE's specs look almost identical to those of the Radeon Mobility, sans the power management technology. Although no notebooks take advantage of the Radeon Mobility yet, the Radeon VE should give us at least a vague glimpse of how the Radeon Mobility will perform in some instances.



The Card

The Radeon VE looks strikingly similar to the Radeon SDR, most likely because the two appear to be made on nearly the same PCB. In contrast to the faster performing Radeon SDR, as well as other Radeon cards, the Radeon VE comes without a fan, only a heatsink. As we mentioned in our original Radeon DDR review, it seems that a Radeon core running at 183 MHz does not really even need a fan: the fan is most likely added for aesthetic value, as people associate a chip with a fan on it with speed. To cut costs further, the Radeon VE skips the fan altogether and uses just a standard heatsink, attached via a layer of thermal glue. The heatsink does get hot to the touch, but it is unlikely that even the heatsink is necessary.

A DVI-I port as well as an S-video out port will come standard on every retail Radeon VE card sold. ATI includes these ports so that the user can take advantage of its HydraVision multiple monitor support (more on this later). Also included in every package is a generic DVI-I to VGA converter that allows the card's DVI port to power any standard 15-pin monitor. Unlike some GeForce2 MX cards that claim TwinView support, users buying the Radeon VE can rest assured that the card will work with nearly every monitor they have, as well as every television with an S-video or composite input. The DVI-I port is powered by the Radeon VE's internal TMDS transmitter.

Our Radeon VE came outfitted with 32MB of 5.5ns Hyundai RAM running at 183MHz, right at the memory's spec. The card also supports 64MB and 16MB configurations, with 16MB cards much more likely to appear than 64MB ones, at least on the retail market. ATI has produced OEM multipack versions of both a VGA-only (no multi-monitor support) 16MB card and a 64MB version of the card we are looking at today.

Earlier we were speaking about how much space removing both the Charisma engine and one of the rendering pipelines saved. By looking at the back of both the Radeon VE and the Radeon SDR, we can take a guess (note that the pictures are to scale).

Radeon VE Core
Radeon SDR Core

The answer seems to be: a lot. Estimating die size by the rectangle of surface-mounted components on the back of each card directly behind the core, you can see how much smaller the Radeon VE's core appears to be. This means big money savings for ATI.

Before we go on, let's take a look and see how the Radeon VE's numbers compare to the competition.

Video Card Specification Comparison

|                               | ATI Radeon VE | ATI Radeon SDR  | NVIDIA GeForce2 MX            | Matrox G450 | 3dfx Voodoo4 4500    |
|-------------------------------|---------------|-----------------|-------------------------------|-------------|----------------------|
| Core                          | Rage6C        | Rage6C          | NV11                          | G450        | Napalm (VSA-100)     |
| Clock Speed                   | 183MHz        | 183MHz          | 175MHz                        | 125MHz      | 166MHz               |
| Number of Chips               | 1             | 1               | 1                             | 1           | 1                    |
| Rendering Pipelines           | 1             | 2               | 2                             | 2           | 2                    |
| Texels/Clock (per pipeline)   | 3             | 3               | 2                             | 1           | 1                    |
| Texels/Second                 | 549 Million   | 1100 Million    | 700 Million                   | 250 Million | 333 Million          |
| Memory Bus                    | 64-bit DDR    | 128-bit SDR/DDR | 128-bit SDR or 64-bit SDR/DDR | 64-bit DDR  | 128-bit SDR          |
| Memory Clock                  | 183MHz DDR (366MHz) | 183MHz SDR | 166MHz SDR                   | 166MHz DDR  | 166MHz SDR           |
| Memory Bandwidth              | 2.9 GB/s      | 2.9 GB/s        | 2.7 GB/s                      | 2.7 GB/s    | 2.7 GB/s             |
| Manufacturing Process         | 0.18-micron   | 0.18-micron     | 0.18-micron                   | 0.18-micron | 0.25-micron (Enhanced) |


Multiple Monitors, ATI's Way

Perhaps the main selling point of the Radeon VE is its multiple monitor support, provided via Appian's HydraVision technology and meant to battle both the Matrox G450 and the GeForce2 MX head on.

ATI turned to Appian, a company that has been providing multiple monitor solutions in the workstation market for quite some time now, for its multi-monitor functions; as such, Appian had already worked many kinks out of the system by the time ATI came calling. Items such as correctly handling dialogue boxes, zooming in on different parts of the screen, and software support had already been taken care of.

As mentioned before, ATI includes both an S-video port and a DVI-I port on the Radeon VE to allow for multi-display support. The two cannot be used at once, meaning that only the standard VGA port in combination with the DVI-I port, or the standard VGA port in combination with the S-video out port, can be used. Let's see some of the things that ATI's HydraVision can do.

As you can see, setup of the dual display is taken care of in Windows itself, making it easy to set resolutions and color depths for each individual monitor. All one has to do is click on the appropriate monitor and set it to the desired settings. This is the way all multi-monitor solutions work.

Going further into the display properties page, one finds a list of attached devices as well as the settings for each. This is something rather unique to ATI, as it allows on the fly changing of the primary and secondary displays (which one shows up on the right or left and which one gets hardware acceleration in OpenGL and D3D). In addition, different refresh rates can be set for different monitors: as you can see, the primary display in our tests was running at 60 Hz, while the secondary one was running at 100 Hz.

Another feature unique to ATI is the integration of a maximize-to-both-screens button. Conveniently located to the left of the minimize button, this maximize button serves to stretch the current application across the full width of both screens.

On the taskbar, one will find both the regular ATI icon we have become accustomed to as well as two new buttons. The first one, a rectangular red ATI icon, brings up a menu of HydraVision's most commonly used features. It allows you to launch the desktop manager configuration screen as well as easily manage windows.

Clicking on the desktop manager configuration text brings up the configuration screen, where you can access the advanced features of ATI's HydraVision software. The rectangle in the upper left allows the user to select where on the extended desktop dialogue boxes should be placed. For example, if you run an e-mail program in the background, you can have a new mail notification dialogue box sent to whichever screen you are currently using.

The box in the upper right of the screen allows the user to set the properties of the added maximize button, choosing what happens when the button is clicked.

Finally, the general tab allows the user to set applications to run in the location they were last run at (a very useful feature). Here one can also set the properties for individual applications, shown on the screen below.

Each individual application can be set up to have different properties, a feature that proves very useful in practice. For example, every time you run a program like Internet Explorer, you can have it automatically go to whatever screen you want. This provides an easy way to override the application position memory setting.

The final screen in the HydraVision software allows hotkeys to be set for all types of functions. This provides easy access to some of the more commonly used features of the software and saves quite a few mouse clicks.

The second new icon one will find residing in the taskbar is ATI's MultiDesk feature. This icon provides access to numerous desktops with a click of the mouse or a punch of hotkeys. Each desktop can be configured individually and can be renamed to one's liking. A good example of this feature in use would be to have one desktop called "games" that includes shortcuts to common gaming programs and another desktop named "work" that has an office suite and a shortcut bar configured on it.

Each desktop may be configured to the user's liking, and applications can be forced to be displayed on each desktop.

HydraVision is a very solid multiple monitor solution and should not be ruled out by anyone looking for a dual monitor setup. AnandTech is currently working on a multiple monitor comparison, so look for that in the near future. There we will cover each multiple monitor solution currently on the market, investigate the shortcomings of each software package, detail which situations are ideal for a multiple monitor setup, and help you decide which multiple monitor card is best suited for you.



The Drivers

Besides the changes detailed in the Multiple Monitor section of this review, ATI's Radeon drivers remain unchanged for the Radeon VE. This, unfortunately, leads to many problems.

First off, as we have noted many times in the past, ATI chooses to disable full 32-bit rendering in both Direct3D and OpenGL by incorporating some tricks. Let's take a look at what is going on in the default OpenGL screen first.

The drivers default to "Performance" mode, where a very crucial item is checked: the "Convert 32 bit textures to 16 bit" box. With this box checked, all Radeon cards force games to render 32-bit textures in only 16-bit color, decreasing the image quality of the game. Why would ATI do this? Because converting the textures to a lower quality increases speed. No other card manufacturer that we know of enables this setting by default in OpenGL; both NVIDIA and 3dfx leave the feature disabled upon the install of their drivers. For this reason, the first thing we do when testing cards that use the Radeon driver set is uncheck this box and put the Radeon cards in the same boat as the rest of the products out there.

The second form of trickery comes in D3D mode. Let's see what the default is here.

The highlighted "16;24" selection in the drop down menu for Z-buffer bit depths is the default for Radeon based cards. Once again, image quality is compromised with this setting, because the Z-buffer cannot store the amount of information necessary to render a game properly in full 32-bit mode. The trickery once again results in a performance increase for the Radeon compared to NVIDIA based cards, which allow 16, 24, and 32 bit Z-buffer depths. 3dfx and Matrox also play this trick on the consumer, featuring the same 16 and 24 bit default settings that the Radeon has. The second thing we do when setting up a system with Radeon based drivers is select "16;24;32" from the drop down menu, allowing proper 32-bit rendering.
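To put rough numbers on why Z-buffer depth matters: each additional bit doubles the number of distinct depth values the buffer can distinguish, which is what keeps nearby surfaces from fighting over the same depth slot. A quick illustration:

```python
# Distinct depth values available at each Z-buffer precision.
for bits in (16, 24, 32):
    print(f"{bits}-bit Z-buffer: {2 ** bits:,} representable depth values")

# 16-bit Z-buffer:        65,536 representable depth values
# 24-bit Z-buffer:    16,777,216 representable depth values
# 32-bit Z-buffer: 4,294,967,296 representable depth values
```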

The reason we bring this up is that some out there may be testing without a level playing field. Without unchecking the convert 32 bit textures to 16 bit box and selecting 16;24;32 as the allowed Z-buffer depths, you are giving the Radeon an unfair advantage (provided you also select the 32 bit Z-buffer depth on 3dfx and Matrox products). It is a shame that ATI does this, as it only ends up hurting the consumer by making games run faster but at lower quality. Recently, however, it seems that speed is all consumers are looking for. If that is the case, then at least the speed comparison needs to be performed without one product having an advantage over another.



The Test

Windows 98 SE Test System

Hardware

CPU: AMD Athlon (Thunderbird) 1.1GHz
Motherboard: ASUS A7V
Memory: 128MB PC133 Corsair SDRAM (Micron -7E chips)
Hard Drive: IBM Deskstar DPTA-372050 20.5GB 7200 RPM Ultra ATA/66
CDROM: Philips 48X
Video Cards:
3dfx Voodoo4 4500 AGP 32MB
ATI Radeon 32MB SDR
ATI Radeon VE 32MB
Matrox G450
NVIDIA GeForce2 MX 32MB SDR
Ethernet: Linksys LNE100TX 100Mbit PCI Ethernet Adapter

Software

Operating System: Windows 98 SE
Video Drivers:
3dfx Voodoo4 4500 AGP 32MB - v1.04.00
ATI Radeon 32MB SDR - 4.13.7072
ATI Radeon VE 32MB - 4.13.7072
Matrox G450 - 6.24.007
NVIDIA GeForce2 MX 32MB SDR - Detonator3 6.50

Benchmarking Applications

Gaming:
id Software Quake III Arena demo001.dm3
MDK2 Demo
GT Interactive Unreal Tournament 4.32 Reverend's Thunder Demo



Quake III Arena Performance

Already, at 640x480x32, we see how much the Radeon VE's cut-down core limits it. Compared to its full featured brother, the Radeon SDR, the Radeon VE performs 32% slower. With its single rendering pipeline and lack of T&L, the Radeon VE performs more like a Voodoo4 4500 than a GeForce2 MX. The card, however, easily dominates the Matrox G450 by 80%, making the Radeon VE look much more attractive when it comes to dual monitor support.

Once again, we find the Radeon VE performing on the same level as the Voodoo4 4500, about 36% slower than the GeForce2 MX. The Radeon SDR, which was able to reach the top of the graph at this resolution, shows how much of a hit the cuts to the Radeon VE's core result in. Still, the Radeon VE is able to outperform the Matrox G450 by 91%.

1600x1200x32 is a resolution that most owners of budget video cards should stay away from. The Radeon VE is no exception, once again coming very close in performance to the Voodoo4 4500.



MDK2 Performance

Although the Radeon VE still falls to the bottom of the graph in MDK2 at 640x480x32, its nearest competitor has changed. Now the Radeon VE is closest in performance to the Matrox G450, leaving the Voodoo4 4500 behind. The GeForce2 MX easily dominates the Radeon VE by a huge 82%.

At 1024x768x32, the Radeon VE actually saves some face, performing within 18% of its non-crippled counterpart. In fact, the GeForce2 MX is able to outperform the Radeon VE by "only" 28%. This puts the Radeon VE in a much better light, as it is not as far off from the competition. The Matrox G450, in fact, proved to be much slower than the Radeon VE at this resolution in MDK2.

Finally, at 1600x1200x32, all cards perform rather poorly. The Radeon SDR and the GeForce2 MX are able to provide 24 frames per second, however the Radeon VE falls 47% behind at 16 frames per second. Once again, the Radeon VE slightly outperforms the G450 at this resolution.



Unreal Tournament Performance

The minimum framerates are very similar across the board, with the exception of the G450, which lags behind. The Radeon VE performs nearly identically to the Radeon SDR in Unreal Tournament, showing the limitations of the Unreal engine.

Once again, the Radeon VE performs nearly identically to the Radeon SDR. This shows how the Unreal engine really does not stress the core of a graphics processor, but rather other parts of the system, such as the memory subsystem. It is for this reason that the Radeon VE can come so close to the Radeon SDR, with the GeForce2 MX holding only a slight lead.

At 1024x768x32, the game becomes a bit more intensive. No longer can the Radeon VE keep pace with the more powerful Radeon SDR when it comes to minimum framerate. Let's see how the cards' average performance at this resolution stacks up.

The average framerates show, once again, that raw processing power is not what Unreal Tournament takes advantage of. More likely, it is the memory system of the video cards being stressed, resulting in similar performance from the top three performers.

Finally, at 1600x1200, the cards differ by only a few points in the minimum framerate measure.

The average framerates, however, show the Radeon SDR far outperforming the Radeon VE, proving that at this high a resolution video processing power does begin to matter. No longer are the memory subsystem and CPU of the computer being stressed; the video card itself becomes the bottleneck.



16-bit vs 32-bit Performance

As we have seen in the past, the Radeon cards do not lose as much performance when making the switch from 16-bit color to 32-bit color. The performance decrease for the Radeon VE from 1024x768x16 to 1024x768x32 was about 17%. In contrast, the GeForce2 MX dropped 49% when making the same switch.

Going from 1600x1200x16 to 1600x1200x32, the Radeon VE dropped 25% in performance, while the GeForce2 MX dropped 78%. Although the small gap between the Radeon VE's 16-bit and 32-bit performance is attractive now, it is not as good in the long run. Since falling back on 16-bit color will not provide as much of a performance boost on the Radeon VE as it does on the GeForce2 MX, future games may have trouble running up to par on this card.
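For those curious how such figures are computed, the drop is simply the relative change in average framerate between the two color depths (the framerates below are hypothetical stand-ins for illustration, not our measured results):

```python
def percent_drop(fps_16bit, fps_32bit):
    """Relative performance lost when moving from 16-bit to 32-bit color."""
    return (fps_16bit - fps_32bit) / fps_16bit * 100

# Hypothetical example: 60 fps in 16-bit color, 50 fps in 32-bit color.
print(f"{percent_drop(60, 50):.0f}% drop")  # ~17% drop
```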



CPU Scaling Performance

It is clear that the Radeon VE likes high performance machines. The Radeon VE performs best with a Pentium III 800 MHz and above, leaving Duron 750 MHz and slower processors running a bit slow.

The performance gain going from the Duron 750 MHz to the Pentium III 800 MHz was 29%. In contrast, going up 200 MHz from the Pentium III 800 MHz to the Pentium III 1 GHz provided only a 6% increase in speed. It is not that the Radeon VE dislikes the Duron platform, as the card performed fine on both our Athlon-C 1 GHz and the Thunderbird 1.1 GHz systems used in the tests; it is just that the Radeon VE likes a high powered system.

This can likely be attributed to the Radeon VE's lack of T&L. Cards without a T&L engine are forced to turn to the CPU for these functions. As a result, CPU power is consumed much more readily in systems without a T&L capable card.
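To illustrate the kind of work that falls back on the CPU, here is a simplified sketch of per-vertex transform and lighting in Python with NumPy (the function and variable names are our own, purely for illustration; real games do this in hand-tuned assembly):

```python
import numpy as np

def software_tnl(vertices, normals, mvp, light_dir):
    """Per-vertex transform and lighting done on the CPU -- the work a
    hardware T&L engine like ATI's Charisma engine would otherwise offload."""
    # Transform: append w=1 and multiply by the 4x4 model-view-projection matrix.
    homogeneous = np.hstack([vertices, np.ones((len(vertices), 1))])
    clip = homogeneous @ mvp.T
    # Perspective divide to normalized device coordinates.
    ndc = clip[:, :3] / clip[:, 3:4]
    # Lighting: simple per-vertex diffuse term, clamp(N . L, 0, 1).
    diffuse = np.clip(normals @ light_dir, 0.0, 1.0)
    return ndc, diffuse

# Every frame this runs over every vertex in the scene -- tens of thousands
# of matrix multiplies that a fast CPU absorbs far more gracefully than a
# Duron 750.
```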



Windows 2000 Driver Performance

For this section we will have to ask you to wait until tomorrow, when we will feature an article investigating driver performance of various cards under Windows 98 as well as Windows 2000. We will investigate how the Radeon drivers perform in a variety of situations in Windows 2000 and also see how these drivers compare with their NVIDIA counterparts. Be sure to check back with us tomorrow.

Conclusion

The Radeon VE is not ATI's first stab at the budget video card market; the Radeon SDR gets that claim to fame. The Radeon VE does attack a new market, however, being the first to bring full featured dual monitor support to the Radeon product line. It is likely that this feature will be passed down to future Radeon products as new ones come out. But, as of now, the Radeon VE is the only ATI product to offer the dual display feature. How does it fare in comparison to the other multi-monitor video cards out there, NVIDIA's GeForce2 MX and Matrox's G450?

Well, when it comes to performance, the GeForce2 MX wins hands down. The Radeon VE cannot keep up with the MX's T&L engine and dual pipelines. Even though the Radeon VE is clocked slightly higher than the GeForce2 MX, it cannot make up for its deficits.

In terms of dual monitor support, it seems that there may be some better options out there. The Radeon VE's HydraVision software is very solid and easy to use, but it may not be the best. We will find out in a few days when we release our full multi-monitor comparison.

The item holding the Radeon VE back is the thing that has been the Achilles' heel for the whole Radeon series. Although Appian's software is great, the ATI drivers still contain problems. When performing our tests for the upcoming review detailing Windows 2000 performance, we ran into a multitude of problems that will be discussed soon.

The price of the Radeon VE is right. At $99 after a $20 mail-in rebate, the card provides dual display capabilities at the lowest price out there. However, with the GeForce2 MX also costing around $100 and the Radeon SDR at $99, the Radeon VE faces quite some competition, even from another ATI product. One has to wonder if now was the best time to introduce this card, and if the extremely budget oriented Radeon VE was the product to do it with. For some the answer will be yes, as the low price of the Radeon VE provides excellent multi-monitor support for a powerful workstation. For many, however, the answer will be no. Many will prefer to wait for a higher performance multi-monitor solution from ATI, or perhaps go with the GeForce2 MX so that their workstation can be a truly powerful gaming system as well.
