How Intel & AMD Made 3D Faster – 3DNow! vs. SSE

July 30, 2019

We’ve looked at MMX in a previous Byte Size. So now it’s time for the talk… the 3DNow! talk.

In February 1999, Intel introduced the Pentium III processor, which included an update to the Pentium II’s MMX model called Streaming SIMD Extensions, or SSE. These new instructions were known as Katmai New Instructions up until their debut, as Katmai was also the Pentium III’s codename. The cost-reduced Celeron 533A and faster models also included SSE.

SIMD – an initialism within an initialism – stands for Single Instruction, Multiple Data. Also known as vectorised instructions, SIMD allows the processor to apply the same operation to multiple data points simultaneously, increasing the speed of applications that make use of it. Take an image whose brightness is being changed: the same value needs to be subtracted from every pixel. Rather than processing one pixel at a time, SIMD can handle multiple pixels simultaneously.

SSE is made up of 70 new instructions for
graphics and sound processing, building upon the MMX technology released two years prior. As well as adding further multimedia-enhancing instructions, SSE allows floating point calculations to utilise a separate unit within the processor, instead of sharing the floating point unit as previous MMX iterations did. There are also new integer and cacheability control instructions, all of which are useful for imaging, 3D video, streaming and MPEG-2 decoding – which reduced the need for a separate MPEG-2 decoder card back in the late 90s.

The fact that SSE supported four floating point operations per cycle and allowed data prefetching – a mechanism for reading data into the cache before it’s actually called for – also meant the new instructions were useful for 3D rendering in any software which took advantage of them. Something which AMD’s 3DNow! was also intended
to help with.

3DNow! is really AMD’s alternative to Intel’s SSE. I say alternative, but actually, 3DNow! was introduced before SSE, in the K6-2 processor launched in May 1998. AMD had licensed MMX from Intel for the K6, released in April 1997, but didn’t want the expense of licensing the new SSE technology Intel was developing for the Pentium III, so their solution was their own set of instructions targeted specifically at the 3D market – a market almost every game had already crossed into, or was crossing into.

As well as the exclamation mark in the name, 3DNow! incorporates 21 instructions, again using SIMD techniques to operate on
arrays of data, rather than single elements. Like Intel, it offers single precision floating
point operations and enables up to four floating point operations per cycle. Data pre-fetching is also supported as is
the ability to mix floating point with MMX instructions, again like SSE. AMD’s claim was that 3DNow! incorporated the same abilities as Intel’s new technology with less complexity, although due to their differences, software written to support SSE will not support 3DNow!, and vice versa.

Both technologies were supported by Windows 98, DirectX 6 and SGI’s OpenGL, as well as by graphics accelerator drivers provided by 3dfx, ATI, Matrox and NVIDIA, although programs such as Adobe Photoshop failed to support AMD’s offering, leaving SSE a little ahead in the professional stakes. But of course AMD, like Cyrix, liked to aim for the budget-conscious consumer – and the gamer, an equally lucrative market.

Here’s how AMD explained 3DNow!:

1. Physics – the CPU performs floating point intensive physics calculations to create simulations of the real world.
2. Geometry – next, the CPU transforms mathematical representations of objects into 3D representations, using floating point intensive 3D geometry.
3. Setup – the CPU starts the process of creating the perspective required for a 3D view, and the graphics accelerator completes it.
4. Rendering – finally, the graphics accelerator applies realistic textures to computer-generated objects, using per-pixel calculations of colour, shadow and position.

Outside of the support by graphics card drivers,
one of the games to directly support 3DNow! was Sub Culture, a game AMD promoted as allowing distant images to appear faster, objects to have more detailed textures at greater distances, and images to have smoother fluidity and geometry when stacked up against the Pentium II. 3DNow! had an install base of more than 9 million by 1999, which AMD were hoping to push past 30 million by the end of the century.

So where did these technologies lead? Well, SSE2 was introduced in November 2000
along with the Pentium 4 processor, adding 144 additional SIMD instructions. Meanwhile, a little ahead and with their install base goal in mind, AMD introduced an enhanced 3DNow! with their Athlon processors in 1999, which added digital signal processing instructions and some further MMX instructions borrowed from SSE.

AMD reached their install base target, but with Intel wielding their usual clout and still holding superior floating point performance, the Intel integration and licensing went further. AMD tried to regain some of the professional market with the Athlon XP in October 2001, which included the full SSE instruction set under the name 3DNow! Professional. AMD would include SSE2 in their Opteron processors in 2003, a progression which, apart from the odd misalignment, would see AMD continue to license SSE technology for compatibility up until SSE4 in 2007. There are new instruction sets after this
released by both AMD and Intel, up to and including today’s processors, but this is really outside of this video’s – and my – scope. What isn’t outside its scope is 3DNow!

So what happened to AMD’s own instruction set, which bridged their transition from MMX to SSE, and which was supported by various games and programs throughout that period? Well, in 2010 AMD announced it was to retire the 3DNow! instruction set, marking the end of a 12 year run, with the K10-derived Phenom II processors being some of the last to incorporate the technology. The only two instructions retained going forward would be PREFETCH and PREFETCHW, leaving SSE to pick up the bulk and rendering any code still making use of 3DNow! redundant.

So I guess that means ultimately, SSE won this little battle. Although 3DNow! will always be a winner for me, just for its incredibly ’90s name alone.


91 Replies to “How Intel & AMD Made 3D Faster – 3DNow! vs. SSE”

  1. Meow *Politely* says:

    im second

  2. Speedy DTM says:

    I remember seeing "SSE/SSE2 required" on buttloads of PC game boxes and always wondered what it meant. Very interesting video!

  3. Tefen Ca says:

    Another great video!

  4. Andrei Andreev says:

    Once again – competition made our experience so much better.

  5. Helio says:

    Did Intel really still have the superior floating point performance by the 00s? In general the pentium 4 didn't stand a chance against K7 on a clock-for-clock basis. I suspect the standardization around Intel's instruction set had a lot more to do with 'clout' than anything else.

  6. Peyton Lutz says:

    Before 1000 views?

  7. taketimeout2 says:

    You know what rings my bell. Graphics cards, AMD K6 and Pentium 3. And everything else since.(Lost interest when AMD brought out Bullldozer.) Exciting times.
    Cyrix? Never heard of them.(sarcasm)

  8. Charles Dorval says:

    Descent:Freespace is my best game ever, thanks for showing it up!
    It deserves to be better known, especially with the HD stuff released afterwards.

    goes to find a joystick hehehe

  9. jerrywh3 says:

    My first computer I built in ‘99 had an AMD K6-2 500 in it. Paired with my Viewsonic 21 inch monitor it made for some great gaming experiences.

  10. Jikyuu says:

    3D Now! … that ! indicates a 'not' … so … Not 3D Now … 3D in a bit?

  11. Tiikoni says:

    I must say I was a bit distracted by those crazy ads on background 🙂

  12. Pelger says:

    k6-2 500 was the worst processor I had, everything ran slower than in a pentium 2 300!!! almost nothing that I had or liked used 3dnow 🙁 when I was lucky and I found something with 3dnow support it ran fine, but I remember like almost no games used it. (maybe not the ones I played? maybe all the sports or racing games used so it was good for "general audiences"?)

  13. rbmk 1000 says:

    great vid, love the 90s stuff however I'm quite sure the Athlon fpu kicked the p4's ass, since they were all around slugs and could only compete with and in later high clocked versions, the overly complicated netburst pipeline is one of Intel's biggest boondoggles and eventually was abandoned, worthy of a video?

  14. WDeranged says:

    3DNow! netted me an extra 10fps in Quake 2 back in the day. I remember being rather pleased about it.

  15. Objectorbit says:

    I've never heard the angle that AMD didn't want to pay the cost for SSE. Do you have a source for that?

  16. SteelCity1981 says:

    cpu's are too cluttered with instruction sets now take SSE for example SSE 4.2 can do everything SEE 1-4 can do, so there isnt a need for SSE 1, 2, 3, 4. 4.1 to be on cpu's anymore. by simplifying the instruction sets you'll make the cpu more efficient and more streamlined..

  17. mstcrow5429 says:

    SSE originally shared registers with the x87 FPU (due to transistor budget?), just like MMX. I think SSE2 fixed this.

  18. asdaffewwerqa asafdaqwrad says:

    oh man, I remember when those processors hit the market. Exciting times.

  19. Philip Storry says:

    I felt sorry for AMD when it came to 3DNow!/SSE.
    If they don't do something, then they're clearly behind Intel. So they must do something.
    Whatever they do isn't likely to get as much marketshare, so it gets less use, and therefore there's not a huge return on that development effort.
    And yet, as usual, AMD reminded Intel that they were doing a crap job. (Intel don't always do a crap job, but when they do AMD are very good at reminding them!)
    The lack of marketshare is a pity, as some of their designs have been bloody wonderful. I remember my Athlon fondly. That was a great processor.
    There have been times when AMD's engineers clearly did a better job, and I think 3DNow! just crosses the line on that one.

  20. Hubert Hans says:

    3Dnow, if used sucked. It was funny: Sometimes you got more FPS. But the game stuttered slightly while aiming around. With SSE, it did not. You can see it in 3D Mark 99 or 2000. SSE, if a high FPS was achieved (Like over 100FPS) it was smooth. On 3D Now: Slight stutter. And its plausible: 3D Now is inferior to SSE. 3D Now only has 64Bit Registers. SSE 128Bit. And it has more registers, also.

  21. Ross Scarlet says:

    Nothing better than starting off a video with a cracked voice!

  22. Retro Amateur says:

    MMX my love


    I like how thorough your investigative videos are. I'd have liked to see more on this!

  24. TechBaron, Cameras and more! says:

    love me some k62. fav CPU from back then

  25. merlingt1 says:

    I had a K6-II 266Mhz? if I remember right and it was a pile of crap. I spent a lot of my money when I was young for this upgrade and I was throughly disappointed.

  26. ChatBoxGuy says:

    Pentium 4 was crap. The Athlon XP ruled! You remove such facts like by 2004, Intel processing graded like crap with the hardware so it was choppy as hell which was why a lot of techs were using the AMD athlon XP processor. However, Pentium had managed to gain contracts with a number of companies such as Dell and so on to have computers made with there choppy processors which were cheep which were what the average American would by without paying attention. You will notice the blue men drumming they are to entice typical teens and young adults of the time to BUY the cool toy. Anyway, AMD focused on those contracts, but they prized there use of higher end hardware as GOOD tech preferred them so there contract with the motherboard winner maker Gigabyte was preferred. They managed to hang on until Microsoft was bought out by another group looking to finish Vista in its failure after launch. As a reward for a contract with Intel as a number of other sources they went out to change the mathematics of the balancing to favor intel's finicky processors. AMD's processors in the offset were a bit slower AFTER, but sense then they have become equal although it is CLEAR in recent times that BOTH are becoming slower as the chips on the motherboards are choppy in speed as time traveled on. Chipset processing is becoming sluggish which is why now the BEST computers for either AMD/Intel processors is quadcore. By being able to perform 4 actions at once they are able to split up and speed up accessing by using 4 channels and when is finished replace it with another set. This makes it so more activity can happen. 
This is WHY President Trump who actually cares about the economy has initiated research in the last few months on research for higher and faster processing chips and WHY Microsoft Believes PC tech has outlived itself and is researching Virtual SSI software which ALL previous attempts are a flop, but I believe that Trump knows that Virtual SSI software CAN be a nice toy for gaming and home play, but office computers and such for programming are traditionally done on a mac or a PC. I haven't used a mac, who knows they might be faster because they hold back from higher graphics for fear of processing overload. I don't use macs cuz there too darn expensive. Anyway, in 2014 threw 2016 technology is advancing in certain areas, but not PC's. Gaming on the X-box versus PS-4, tablet computers, i-phones, Microsoft jumping ship for other ideas in research. This actually gives AMD a advance because they have been continued to be advance performance for the only other operating system in existence. If Microsoft kills themselves off as a business or doesn't return to make another operating system the REMAINING company although there traditional practice hasn't been to focus on Gaming will have too finally concede to support it can get together with other companies such as AMD and Geforce to make a real operating system like no other. You seem to miss that as of 2017 computer technology which still rules our lives VERY much just as the tablet is at a insurpass. Without the hardware problem which Obama ignored to correct, which trump is. But the point is Intel hasn't been better BECAUSE of better research is because of CONTRACTS. LOL, Intel was good in the early 1990's, but they edged off in performance quality.

  27. D. B says:

    I completely bought into the 3d now advertising. I was always yelling at my friends that my computer was better because it had 3dnow

  28. mactipiak says:

    My first proc in my own rig 🙂 blowing up my savings on an K6-2 333MHz… In a case made by: Amstrad!

  29. Adventures in Nostalgia says:

    I really enjoyed this video! This one and the screen savers are probably my favorite videos of yours so far

  30. Arnaud Meert says:

    Damn, Intel adverts were pretty awesome back then.

  31. 387534 7582782 says:

    irony is my mates k6-2 was absolutely shite at 3d compared to my Pentium 2. both with a voodoo 3 2000.

  32. Xilefian says:

    Compilers these days can sometimes take advantage of these instruction set extensions without you needing to put much effort in, so even a basic program that isn't even game-related (with no 3D graphics) could get compiled to use 3DNow/SSE instructions for performance boosts. It doesn't happen as often as some people like to believe, but compilers will do their best to use extensions in cases where they are available.

    Generally, you will get better results and performance if you specifically use SSE intrinsic code and manually program the optimisation than hoping the compiler will take care of it, but the compiler certainly can (and will) take care of it when possible.

  33. Xilefian says:

    MMX is an interesting one as that was pushed specifically as a way to have high-colour (24-bit) graphics at the same performance cost of 8-bit palette graphics, which is certainly the case with MMX (writing 3 bytes in 1 instruction for RGB, versus 1 byte in 1 instruction for palette index) however if you had a processor capable of MMX, chances are you had a GPU accelerator too, so despite gamers and the gaming press gobbling up Intel's message of MMX being a graphics revolution, games didn't use it and went straight to offering hardware accelerated graphics (with 8-bit palette software mode available for compatibility).

    Intel were in a strange war against hardware acceleration right up until 2010; Michael Abrash (famous low-level x86 PC programmer, helped work on Quake, was at Valve for VR, now Oculus) even worked on Pixomatic (software DirectX-7!!!) which was acquired by Intel, and then turned into project Larrabee (Intel's failed GPGPU project, was supposed to be an Intel GPU-CPU hybrid that competed against NVIDIA's CUDA). They figured out it makes more sense to focus on their integrated graphics (which are getting better and better and will play a big role when multi-GPU capable graphics+compute APIs like Vulkan get fully explored, AMD's APUs would be absolutely wonderful in this field).

  34. Rex Warden says:

    My first computer was an AMD K6-2 300 with 3DNow!, and I still have it. It's in a new case, as the original literally rusted to pieces, but the motherboard and I/O shield survived. The hard part is finding a purpose for this system, though I've almost maxed it out as far as upgrading goes. Maybe I'll fill up its ISA slots.

  35. wildbilltexas says:

    The K6-2 had a weak floating point for 3D gaming, so I saw 3D NOW! as just a quick fix or advertising hype. I remember several games including Quake that had 3D NOW! patches on their websites so they'd play faster on the K6-2.

  36. Jonathon van der Wijngaart says:

    Still running one of the 3D-NOW! chips in my daily driver, the Phenom II X6 1100t Black Edition, bumped up to 3.8Ghz base 4.2Ghz boost, due for an upgrade at some point

  37. Borna Prpic says:

    Another great video. Keep up the awesome work. From a guy, born "beyond the iron curtain"ish" " in 1987. Played on Amiga 500 from age 3 and got my first PC then (286) which I bumped up later (much later after my clone nintedo famicon, cousins sega megadrive, my clone atari 2600) to an AMD 5×86 133 MHz 🙂 I think a long story about AMD should happen. It's an interesting one and since Ryzen came out this year. It's a great budget option wise and has almost always been (with dips here and there but hey. It's not a behemoth like Intel hehehe).

  38. Ichabaud Craine says:


  39. Antel Drobat says:

    One of the best channels on YouTube

  40. Tanguy Fautré says:

    Hi. Where did you manage to find such a high quality video of 3Dfx adverts?

    IIRC there were three ads: food, medicine and environment. Would love to download a high quality version of each. The youtube versions are highly compressed (i.e. poor quality).

    Never used 3D Now! But did a lot of programming with SSE/SSE2. Nowadays CUDA and GPUs really take the whole SIMD concept to a new level, the recently added AVX512 feels a bit underwhelming in that regard. We're now programming new apps using CUDA, never AVX512. Intel may have won the old 90's SIMD battle against AMD, but NVIDIA is certainly ahead of Intel when it comes to massively parallel architectures and instructions.

  41. The Trivium says:

    Powered many an hour of Wargasm!

  42. Waldon Newman says:

    I got to be honest but didn't understand a word of it.

  43. ShadowKat Studios says:

    Had a terrible VIA C3 chip with 3DNow! specifically.
    It still sucked.

  44. Jeremy Johnson says:

    Ah, the K6's FPU. A huge thorn in my side until I got a P3 833 in 2000. It did advance FP SIMD but god was it a dog outside of SIMD. Celeron Mendocino wiped the floor with it.

    By the way, Intel could not compete with K7's floating point performance. Certainly not.

  45. chrisbinghamton says:

    Very informative! In my CS classes, though, I've always heard SIMD pronounced as "sim-dee".

  46. Dashiell Barlow says:

    AMD's retirement of 3DNow! had some interesting consequences, particularly with sloppy, rushed, or otherwise sub-par ports to PC from console. 2007's Mass Effect is a very good example of this: on 2 (or 3, it's difficult to ascertain exactly how many of the graphical glitches are related unless you're some sort of forensic programmer, and I am not) of the game's 4 major plot planets, there are areas which refuse to render properly if you have a non-3DNow! AMD CPU. There are mods to mitigate this issue, but unless a player is aware of the Nexus (in a larger sense than, 'oh, that's where I get fixes for Fallout/TES'), or the PCGamingWiki, or some other source, it's simply going to have unplayable required segments.

  47. EarthboundX says:

    Haha, another video I didn't understand half of.

  48. MickeyGoneWild says:

    A bit too nerdy?

  49. SpAM_CAN says:

    SIMD wouldn't have been able to modify multiple pixels at once, but it could modify every value of a colour at once, since SIMD traditionally only uses 4 data points at once, and a colour is made up of red, green, blue and alpha values. Otherwise, a brightness operation as you suggested would require modifying every single value of a pixel separately. In maths like this, SIMD grants a 3-4x speed improvement.

  50. Leeki85 says:

    I had Athlon 1.3 GHz and lack of SSE instructions made this CPU obsolete way too early. It was just unfair to get a game that required Pentium III 800 MHz like Tomb Raider Legend and get a big "This game requires CPU with SSE instruction set" notification.
    The same was with almost everything that was released after 2004. I still have that computer and for fun I installed Windows XP on it few months ago. Almost no modern software works. You can't run Steam, which requires SSE2. You can't even run modern browsers. Opera 12 is probably the newest one that don't need SSE.

    It's probably Intel influence to shutdown 3DNow! When you make software in C++ or other popular languages, you don't use any instruction sets explicitly. Compilers do the hard work, and optimizing for SSE is just one options. There were really no reason to don't support 3DNow! ten years ago. It would just make binary bigger by 1-5%. Even assuming if it would work slower than SSE, Athlon 1.3 GHz still would be much faster than P3 800 MHz.

    There's really a bad trend of abandoning older hardware, even if it's still capable. Resident Evil 7 was a game that didn't even run on i5 2500 and other CPUs from that period, because binary was compiled with requirement of newer instruction set.
    Apple is just famous for their planned obsolescence. Each year they made some of their devices useless. You can't make 32-bit apps for iOS anymore, and Macs are being stupidly discarded. Safari just won't process HTML5 on CPUs older than Sandy Bridge, while Chrome Opera and Firefox won't have any issue on the same computer.

    I'm the type of consumer that if forced to upgrade, will switch to competition products. This is why I'm using Intel CPUs, Android Phones and Nvidia GPUs. Although I might get Ryzen in a next upgrade if AMD manages to fulfill their promises of extended suppport for AM4 socket.

  51. The Furry Gamer says:

    Speaking of the Pentium 3, I know that processor kept on being used into the early 2000s

  52. Chris McFee says:

    I had a k6-2 350mhz. I loved it. i remember quake had a 3dnow patch. it was awesome!

  54. Smartzenegger says:

    So now it's kinda like "3DThen!"

  55. slightlyevolved says:

    I've really like these videos on the MMX/3dnow/SSE features, and I know it's not your normal thing, but since you've covered all these SMID types; have you given any thought to also doing one of the PowerPC's AltiVec instruction set?

    As I said, not quite your normal field of x86 topics, but in context, I think one that would be very interesting as a piece of this series.

  56. Savage Scientist says:

    PowerPC was killing Intel in the mid 90s

  57. undeadelite says:

    3D now? More like 3D soon…

  58. umageddon says:

    Maybe you could talk about the Duron and Celeron 300B and overlcoking in that era

  59. mapesdhs says:

    3D Now!… The Buddy Christ of tech marketing names. 😀

  60. Julian Uccetta says:

    😀 I have an old Compaq sitting next to me with an AMD K6-2

  61. Mac Daniel says:

    3DNow! was nothing but a marketing gag. Every AMD CPU was slow as hell. I had some K6-2 processors back in the days because they where cheap. One of my friends had an Pentium 2 350 System that outperformed my K6-2 500 by far.
    But today I like the K6-2 processors 🙂

  62. John Smith says:

    Pentium 4 was total SHIT! I avoided that processor I was proven right until core 2 duo came into scene… but I hate Intel's consumerism, if you mobo dies, you have to buy a new cpu+mobo combo… with AMD that only happens with certain amount of time, since I have AMD, I have changed the motherboard, next the CPU years later the mobo and so on… with intel, every year they change the architecture and make the hardware not compatible with each other after less than a year… I will NEVER buy an intel based computer because I don't want to go bankrupt…

  63. tsartomato says:

    1 you screwed up aspect ratio

  64. Pp Lime says:

    Intel, nvidia, creative stole from competitive (basically everything advance at that time)

  65. Z Ye says:

    U are a gem of youtube!

    Keep up the good work

  66. Grgazola says:

    Yay for data prefetching, on a side note.

  67. Achilleas Labrou says:

    For real 3D performance only a graphic card with powerful GPU was adequate. Of course for laptops or very cheap motherboards with intergraded graphic card this technology was good.

  68. George Hilty says:

    when i was younger, i had a Pentium 3 running at 500 MHz with SSE, and he had a Pentium 2 running at MHz, and omg what difference there was between the two! there wasn't much difference in speed, and everything else was comparable. but my Pentium 3 would run circles around him in gaming back then!

  69. methanbreather says:

    you failed to mention INTEL doing their usual shady/criminal stuff… which was also a factor and resulted in lots of fun for the lawyers.

  70. gallwapa says:

    Love the Freespace clips 🙂

  71. Daniel Roberts says:

    Fucking idiot, comparing Pentium III era SSE to Pentium era MMX. I hope you develop prostate cancer and die.

  72. Jared Garbo says:

    1:28 90's as fuck.

  73. Tim Evans says:

    What killed it was the advent of graphics cards with 3D functions, they ran as CPU units on the card.

  74. Wilson J says:

    I almost forgot how cheesy many of intel's ads were back then.

  75. Dalle Smalhals says:

    ..and now we have SSE 4.1 & 4.2 for DRM ;-D

  76. Jari Haukilahti says:

    So fast is the fastest ( latest )APU/CPU with 3dNow ? (AMD 3870 ?? )

  77. statikreg says:

    huh…didn't realize 3dnow was discontinued…! – makes it look like it was still supported until Bullshit……..I mean dozer…..

  78. Darren Cole Gold says:

    Reminds me of my first 'gaming' PC, slot one P3 with 100mhz of frontside bus, simm style ram that had to be installed in pairs (I doubled mine to something still pathetically small), upgraded the HDD and clean installed windows 2k, and fitted a GeForce 4 4200 to it's AGP port, playing the shit out of CS 1.5, and quake rocket arena, getting together with highschool friends to LAN on weekends.

  79. Hunter's Moon says:

    Still got my K6-2 500MHz

  80. Micro Master Retro Edition says:

    Excellence as standard Peter😁😁😁Kim😁😁😁

  81. JoeStuffz says:

    Microsoft took advantage of every 64-bit CPU having SSE2, making it a requirement for 64-bit Windows. 64-bit Windows will accelerate certain floating-point operations by running them using the SSE2 unit instead on

  82. Richard Vaughn says:

    Really though the whole SSE MMX etc stuff stopped mattering after AMD64 came out because it transformed into a core count and memory throughput weapons race.
    We still have an entire ocean of instruction set extensions that do all sorts of things but nobody cares because using them makes your software not work on many computers and they don't affect gaming anymore.
    Basically in todays world you set your compiler flags to target the basic AMD64 instruction set and things just work.
    But really though, every CPU on the market today has like an entire list of instruction set extensions that most people have never heard of and rarely use.

  83. Tetra Digm says:

    ahh the days of the Thunderbird. the only time AMD had anything worth buying. i remember going directly from a 950mhz Athlon to a 2.4Ghz P4 and thinking "meh. windows loads 5 second quicker, and the onboard video has more memory." but was otherwise unimpressed with the supposed large difference between the two.these were also the day when gigabyte made products worth buying, that didnt shit out in 6 months.

  84. Retarded Cosmo says:

    freehdeeh nine tok.. good dude that a accent is awful.

  85. ML says:

    Wow a Descent Freespace sighting.

  86. Cavey Möth says:

    My Phenom II X6 1100T supports 3DNow! and I am darn proud of it.

  87. Una Salus Victis says:

    would have been nice to have had some side by sides of comparable setups, for example, a celeron vs a k6-2(or the rare 3 model!!!) , where 3dnow made the k6-2 more capable then even an overclocked celeron, (i owned both, anything that supported 3dnow, ran so much better on 3dnow…it was crazy…)

    there was even a 3dnow-powersgl binary for a few games that those of us who had power-vr cards made great use of..not sure if its even around anymore…. it was hard to find even then, and was never official, but, my god, 3dnow stacked with 2x matrox m3d cards was actually pretty strong… 1024 maxed out was no problem at all, games just maxxed out…. it was great… it made me sad they depricated them, i mean i get why, but would kinda like to see how it would effect perf of older games on say an FX line cpu… since i have not heard of anybody compiling unreal(classic) or quake with modern optimizations…. though the idea isnt bad, could make running them at high settings on very crappy hardware very doable…. pandaboard at full 1080p on classic windows games?

    anyway, i knew all of this, but, for those who didnt, good info, some mention of how much it boosted the k6-2 vs what it was competing against price wise would have been nice, it wasnt an across the board win but, any game that supported 3dnow, really showed a huge gain, my friends who had overclocked celeron 333's where all jelly of my k6-2 that i clocked to 566 or something close to that(board would do 133 bus), without 3d now they won most times, with it…. even games that chugged for them, ran playable smooth for me, another advantage was, the platform cost ALOT less then the celeron rigs of the day.

    IF you do setup a k6 rig, you really need to delid the chip carefully, and replace the thermal compound, even back in the day it was pretty crap, by now, its dry AF, helped a buddy build a k6-3 system a while back when he got ahold of 2 working chips, with the right board, we got around the same clocks, once i took the lid off one and replaced the compound the temps dropped 15-20c (depending on what thermal sensor you believe.. we also had 2 externals in that little lip where the ihs is glued down, put some ASC in there, used a couple dabs of superglue on 2 corners, and put it all back togather, we got a stable 666/667mhz out of it… no joke, its still running a couple years later, and gets used for classic games alot, his sons friends will get on his classic systems and play old games like Unreal/quake/etc for hours on end, and the k6-3 was a very nice chip… sad its so rare…. i didnt even know about it back in the day… then the athelon, then duron came along for those of us on a budget… 🙂

  88. Tony Nameless says:

    I never had any good experience with AMD processors.
    I never had any bad experience with INTEL processors.

  89. szt1980 says:

    3DNow! was more advanced: it allowed horizontal ops since its inception, something that SSE could only do since the PNI

  90. Eugene Miyelis says:

    I think AMD shared their x64 CPU instructions with Intel for all current and future SSE sets in return. So win/win.

  91. Ali Mehmet says:

    YES, I have the AMD Phenom II X6 1100T with 3DNow! technology.
