Worst Game Graphics Cards – Trident 3DImàge 9750

July 30, 2019

Welcome back to the third episode of Worst Gaming
Cards. Today we’re going to take a look at Trident’s first 3D accelerator, the 3DImage
9750. This card arrived on the market later than
competing products and at a much lower price. So now we’re going to find out why
it didn’t become a popular, low-cost alternative. But first, as before, let’s talk a bit about
the company’s history. Trident Microsystems was a Californian company founded in 1987.
Trident’s market focus was cheap desktop and notebook graphics chips, as clearly indicated
by the company’s slogan: “Digital Media for the Masses”. The mysterious VGA chip named TVGA8200LX was
probably Trident’s first product. Sadly, it seems that not a single specimen has survived
to this day, as we couldn’t find a single photo of this card or chip. So if
you have one, please upload a photo of your card somewhere, or knowledge of this card
will be lost forever. In 1988, the now also quite rare TVGA8800BR
followed suit, supporting a maximum of 512 kB of memory. The next year, these chips were replaced
by the more common TVGA8800CS, used on cards with support for legacy EGA monitors. Between 1990 and 1992, many submodels
of the SVGA chip named TVGA8900 were released, supporting a maximum of 1 MB of memory. Also, in 1991, the ultra-low-cost TVGA9000
line of chips appeared, supporting a maximum of only half a meg of memory. This line of
products was phased out in 1994 with the release of the TVGA9000i-3. The mainstream 8900 line was updated in 1993
with the TVGA9200CXr, supporting up to 2 MB of memory and the VESA Local Bus, or VLB for short. Within the same year, the TGUI9400CXi was also
announced with the same enhancements over predecessor chipsets, but this chip was also
a 32-bit GUI accelerator. And yeah, the chips’ naming schemes only became
more complicated as time went on! In 1994, updated chips were released with
PCI bus and EDO memory support. Two product lines were established – the cheaper 32-bit
TGUI9440 and the more powerful 64-bit TGUI9440AGi. The 64-bit line of chips continued the next year
with the TGUI9660, 9680 and 9682 models. All of them supported 4 MB of memory; however, the 968x
models also provided video acceleration. In 1996, only one chip was released: the ProVidia
9685 arrived with new product-line naming and TV-Out support. And finally, on October 14, 1996, Trident’s
first 3D chip was announced, the 3DImage 9750, using the company’s rCADE3D technology. Unfortunately, cards based on this chip hit
the market in June 1997, about a year later than competing 3D accelerators from most
other vendors. Jaton, a third-party manufacturer, praised the 3DImage
for its price-to-performance ratio while also greatly underestimating gaming industry requirements.
Jaton’s senior director of graphics marketing said the following: “The gaming industry has been frustrated
by existing 3D hardware solutions. The boards are either outrageously expensive or remain
unable to deliver the minimum 15 fps capability required for a 3D game.” Looks like someone greatly reduced expectations
for their own product. 15 FPS for a 3D game – really? These tremendous performance numbers could
be achieved thanks to the 64-bit engine with AGP support, 83 or 100 MHz SGRAM memory and
an integrated triangle engine offloading 3D scene computation from the CPU.
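As a rough, hypothetical back-of-the-envelope sketch (assuming a single pixel pipeline emitting one pixel per clock, which we have not verified for this chip), the theoretical peak fillrate simply scales with the core clock:

```python
# Hypothetical peak-fillrate arithmetic for a single-pipeline chip of this era.
# Assumption (not confirmed for the 9750): one pixel emitted per clock cycle.

def theoretical_fillrate_mpixels(core_clock_mhz, pixels_per_clock=1):
    """Peak fillrate in megapixels/second = clock (MHz) * pixels per clock."""
    return core_clock_mhz * pixels_per_clock

# Spec-sheet clocks (83/100 MHz) vs. the underclocked retail boards (50-75 MHz)
# and our 91 MHz sample:
for clock in (50, 75, 83, 91, 100):
    print(f"{clock} MHz -> {theoretical_fillrate_mpixels(clock)} Mpixels/s peak")
```

This also makes it obvious why the widespread underclocking hurt so much: a 50 MHz board gives up half the theoretical throughput of the 100 MHz part before drivers even enter the picture.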
The 9750 also supports DVD acceleration and TV Out. Sadly for users, most manufacturers released
their cards greatly underclocked, with clock speeds ranging from 50 MHz to 75 MHz and some even
using older EDO memory. Luckily, in our collection we found a higher-clocked
card running at 91 MHz. With 4 MB of SGRAM memory and AGP 1x support, it should be a representative
example of what this chip can pull off. The 9750 supports only Direct3D, and the recommended
price in 1997 was only 129 US dollars. The card will be compared to a Diamond Monster
3D using the 3Dfx Voodoo Graphics chipset with 4 MB of EDO memory clocked at 50 MHz. In games with texture bugs, the Voodoo 1 will be replaced by an STB Velocity 128 using the Nvidia Riva 128 chip with 4 MB of memory clocked at 100 MHz. Final Reality, unlike with the ViRGE and Mystique,
detects all expected hardware features. That’s a bonus point for the 3DImage. Also, we are no longer plagued by sound issues.
Sadly, the framerate is often too close to or even below the promised 15 frames per
second. 3DMark produces very low scores in the race
test, but a quite promising fillrate. The 3DImage attempts to render all textures
for a few seconds, but gives up really fast and ends without any of them, much like the
Matrox Mystique. The only game in our tests that ends up looking
good even on the ViRGE and the Mystique, Tomb Raider 2, shows the first serious problems
of the 3DImage 9750. The helicopter blades and water transparency are not working, and textures look unfiltered like they did on the Mystique. Forsaken is playable only in 320×240 and clearly
shows poorly filtered textures. This is visible mostly on fire explosion effects. Carmageddon 2 is just too slow to be playable. There’s no official 3DImage 9750 MiniGL
driver or wrapper for GLQuake and the unofficial D3D to OpenGL wrapper created by Techland
wouldn’t work, so we ended up using an official D3D to OpenGL wrapper for S3 cards!
That probably explains the yellow fullscreen effects. Once again, the game is too slow
and plagued by flickering wall textures. Oh well, it was kind of a long shot anyway! Turok looks good for the most part, but is
really slow and the skybox is not rendered properly. Croc is relatively fast, but textures again
look almost unfiltered and the interface produces some transparency errors. Incoming “nicely” showcases all of the
issues you may encounter with the 3DImage. It’s too slow, has poorly filtered textures,
a flickering skybox and transparency errors at every explosion effect. Half-Life works in D3D mode, but is plagued
by flickering wall textures until the card eventually gives up all texturing pretenses
and turns the whole game into a snow blindness simulator. Wipeout is actually quite fast compared to
most other tested games. The only perceivable issue is the non-transparent trees near the
track. Expendable suffers from a low framerate and
broken textures, similar to how the Voodoo 1 renders on fast CPUs. Aliens vs Predator clearly showcases issues
with transparency in text boxes and features minor image quality problems with flares.
And yeah, it’s also pretty slow. As always, the hardest test is the last one.
Will it run Unreal? The game actually runs in Direct3D mode, and
for the first three seconds it looks promising. But then, the 9750 goes into textureless mode
in a vain attempt to increase the framerate. It is unable to finish the timedemo as it
freezes right before the end. There is only one positive thing we can say
about Trident’s 3DImage 9750. On paper, it supports all expected 3D features and technologies. But, on the negative side, there are a lot more
issues with this card than there are features and accomplishments. First off, it’s too
slow, because it really does deliver the promised 15 FPS, if not less, in most games.
Also, texture filtering quality is poor and some games look almost like they are running
on software rendering. What’s more, transparency effects are plagued
by driver bugs and are often broken. You might also notice blinking or flickering
textures, most often in the sky and in all Quake-engine-based games.
Lastly, the 9750 has no support for OpenGL, and it sometimes drops texturing in more advanced
games that were released later. Mostly thanks to its 3D feature support, we
award the 3DImage 9750 two stars. The card is also a bit faster than the ViRGE,
but it’s still light years away from producing playable framerates. It’s also not a good
choice due to the numerous bugs and relatively slow 2D performance. That’s all for today, thanks for watching, and
as always you can now enjoy the demo of 3DMark 99. Or at least its first half, because in
the race scene, the 3DImage goes textureless once more.


54 Replies to “Worst Game Graphics Cards – Trident 3DImàge 9750”

  1. Stadium ARTs says:

    I could watch these videos all day long. Its exactly what I want/need/enjoy.

  2. esmeraldaygirasol says:

    this remind me cybernet

  3. Levente Frindt says:

    These videos are actually quite nice and interesting. Will you also test cards that didn't suck, like the Rendition Vérité V1000 or the PowerVR PCX1?
    And what about the 1.5th (2nd generation is Voodoo 2 and TNT etc. in my opinion) generation cards, such as the Virge GX2, Vérité V2000, PowerVR PCX2, Rage Pro, Riva 128?

  4. JohnnyNismo says:

    In 1997 my dad bought a Sony VAIO P200MMX system with an integrated ATI RAGE II+ 2MB. It was a frickin' potato. I later bought a PowerVR PCX2 accelerator and it was barely any better. I then saved up enough to get a Riva TNT PCI. Now that changed the 3D paradigm for me. This era began my addiction to framerates and forged me into the PCMR member I am now. Good times!

  5. Christopher Lord says:

    Great to see such a thorough history of Trident, thanks 🙂

  6. jdsm888 says:

    I had this card in my Pentium 2 266 system. I found that running games in software mode was significantly faster then running in "3D accelerated " mode.

  7. Miguel Que says:

    It wasn't good at 2D either.

  8. Ratto Poika says:

    I bought this card for half of an eur,, now I'm hoping someone would buy me for a full euro. o/

  9. Señor Dossier [Retro reviews, Coleccionismo & más] says:

    even the intel 815 igp is better :V

  10. Leo Hale says:

    That was a horror show, am soooo glad my first 3D card was a Riva 128 on a PIII 400, thank heaven I didn't have to put up with that.

  11. plk173 says:

    so glad this was autosuggested to me

  12. David McCormick says:

    Are you Czech? Cool!

  13. Zipplet says:

    This is the ONLY choice now that all current GPUs are being bought by crypto miners. 😉 At least it can't mine Ethereum.

  14. Ross Scarlet says:

    Now that is a real pile of shite

  15. MegaUnwetter says:

    Can it run Crysis?

  16. wilsonnw says:

    Trident TVGA9000 also supported VESA, I had one.

  17. unfa says:

    My beloved Unreal is butchered! Nooooo!!

  18. Marcin Kąpiński says:

    I've had TI 9750 AGP in my very first PC. Never used 3d acceleration with it. It was simply too slow and/or glitchy. Yet I had good time gaming in software rendering. I've changaed that card when games started requiring D3D to run.

  19. Vfl666 says:

    what a crappy card i had a diamond monster voodoo1

  20. Leberkas Semmel says:

    It was not that bad. But Unreal hurt my heart. Knowing what this looks like with Glide, with reflections, butter smooth, and THIS! Not even Textures!

  21. Scott Dinh says:

    I use to own two Trident card.. dam I got con thinking it was a real 3D "gamer" card. :##fail##

  22. Jeffrey Bozko says:

    my matrox g200 8 mb memory eats this thing alive!

  23. eri says:

    makes your pc games look like ps1 games

  24. Doktor Hachi Roku says:

    Most parts of this video is just a slideshow. I had NO IDEA the Trident is THAT bad.

  25. Raven Gaming says:

    Now I know how Unreal would look like on the SNES with Super FX chip lol

  26. duje225_Rebooted says:

    9:07 The helicopter rotors were a SQUARE on the 3DImage?! I'm so plagued!

  27. duje225_Rebooted says:

    sry i mean blades lol

  28. Eep386 says:

    This was basically Trident's counterpart to the infamous S3 ViRGE. Barely any faster than a ViRGE, and with much worse 3D image quality. Its saving grace is that its 2D was highly decent (though no barnstormer) and DOS performance was okay, so it wasn't a total bust.

    Trident wouldn't get the hang of 3D rendering until Blade 3D, which was their first 2D/3D product to not be wholly inadequate for low-res gaming. Unfortunately driver troubles were never fully resolved.

  29. Melchiah The Obscene says:

    That draw distance.

  30. Osystem says:

    It looks like the texture coordinates are calculated using integer values, and they shake like PlayStation 1 polygons :). I wonder if there is some PC GPU for that time, that also calculates vertex positions using integers… that way it would look just like a PS1.

  31. Michael Cox says:

    Yep, that smooth smooth 15 fps. :p

  32. Ytaken says:

    Now I'm seriously considering building a vtg gaming PC :/

  33. Todd Stewart says:

    I had a Trident VGA card once, it was in my old 286.

  34. Immense Data says:

    Hmm, a video with "the worst" in the title, 10 times the views than subscribers AND it was recommended to me, who never heard about you till now. Yep, you're blowing up.

  35. Katie says:

    Would have been better off with software rendering.

  36. PentiumMMX says:

    can i pair this with a voodoo 2 8mb

  37. PentiumMMX says:

    just as bad as the s3 virge but the s3 virge is better than the trident for 2d

  38. The Nex Reviews says:

    Geez even for the early 90s 1mb is freaking LOW. Lol recording fraps a like the highest 8 frames per second

  39. samljer says:

    saying pc master race would be upset by that card, is like saying
    they are upset in 2018 because the GTX 1030 exists……. stupid af…

  40. Hi There says:

    I had one in 1998. But it was with TV out. And TV out was only reason. Card was junk

  41. Juan Baroli says:

    There is only one card that can compete with the slowness and lack of features of the 9750: the Matrox Mystique (g100). That card could only generate a 3d image with polygones (no textures xD). Though it was great for 2d apps.

  42. Jaap de Graaf says:

    My 9 Year Old self tried to play Hidden and Dangerous on this thing which I got as a gift…. First great life shock

  43. PROSTO4Tabal says:

    you deserves nobel prize in best video about worse graphics card

  44. HighwayHunkie says:

    Haha i have this one, found it in an old retro machine and was like: "what a waste of AGP space" Does not even produce a clear sharp 2D video signal. Well good to have it, to show ppl what they (didnt) miss(ed) back then. Thanks for your excellent videos on old vintage cards. Good research and professional footage. Thumbs up! Your channel should have plenty thousand subscribers already.

  45. NumenVonKryt!!XD says:

    Yeah that card is legendary thanks to how shitty it is. I still remember how people were pissed when they bought this card and only to find out that Software Mode (if the game has it) runs the game way better in terms of looks and performance talk about money wasted.

  46. Raimar Lunardi says:

    wow, that's shitty indeed…
    my onboard chip from 1998 on a cheap motherboard was better than that…

  47. pc-sound-legacy says:

    Omg.. this game killing screaming civil pedestrians and dogs by car is the sickest i have ever seen.

  48. pc-sound-legacy says:

    Trident was nown to offer low end and cheap vga cards, but this one is a shame. This really was a 3d decellerator instead of accelerator 🙂 You should compare it with the SIS 3D Pro just for fun:-)

  49. Matt says:

    Works Grate in my 486-DX2 66Mhz ROFL

  50. Reza Ghaffarian Shirazi says:

    I assume some were driver bugs, rather than pure hardware issues.

  51. omarsis81 says:

    I really like your videos!

  52. Daniel Martinez Gonzalez says:

    My first pc, a laptop with a Pentium mmx had one of these gpus. It couldn't run anything 3d unless in software window mode at a very low resolution, of course all colours were fucked up. At least 2d was fine and let me play starcraft.

  53. BrandonExplains says:

    I never used a Trident card of any kind, but I did hear about them. A lot of people like to bash the Virge, but since it was released as the first mass market 3D accelerator at a low price point in Oct 1995, I tend to give it a thumbs up. S3 successfully released its product first, a full year before Voodoo1 and Rendition chips started hitting stores (literally one year, Voodoo1 was released Oct 1996). Since S3 Virge had particularly strong video decoding/MPEG performance and 2D performance (for the mid 90's), it was a top notch option when combined with the original Voodoo1. Trident was late to market and its product wasn't that competitive and 2D performance was slow which gave it no competitive benefit (and by 1997 competitors way outperformed it on both 2D/3D). S3 at least tried to compete with new products into the 2000's, although it failed financially against the former ATI and NVIDIA. S3 Savage3D was a far faster product than Voodoo1, and it was released in early 1998 (2 and a half years after Virge, and only a year and a half after the Oct 96 release of Voodoo1). The reason why I'm giving the background is because the 1990's saw a fury of companies competing and increasing speeds at breakneck pace: The S3 Savage3D was a top tier card when it came out, and yet it was only about 26-28 months after the release of Virge. We don't have that same breakneck pace today of advancement and speed improvements relative to older products. And all the people who complained about S3's drivers, I clearly remember they were always a heck of a lot better than ATi's buggy software. I wished that we could have at least had a 3 way race today, but S3 was the last to fall in the Nvidia-Ati duopoly that we ended up with. People forget, the Savage 2000 that was released in 1999 had hardware T&L, it wasn't the GeForce 256 that was necessarily first. S3 was a very innovative company in the 90's. But Nvidia won the business contracts and survived, S3 did not. 
    So that's the 3D industry history lesson for today, class! 😉

  54. Ryan Yoder says:

    I built hundreds of Pentium class systems back in the day for customers at our Uptech Computers store and the appetite for paying 300 for a video card just wasn’t there. So we sold a lot of basic Trident cards. The 3D ones did disappoint. Most just used them for 2D and ignored the hardware acceleration. I know I did. I was programming in DirectDraw and a bit of Direct3D back in the day using retained mode and the HW acceleration was a joke. 1995 to 1997 was mostly about software rendering.
