Are Computers Still Getting Faster?


[cassette door shuts] [keys clicking] [landline phone ringing sound, electronic chords] Welcome to another episode of The 8-Bit Guy. Now, before we get started, I wanted to draw your attention to this beautiful plaque that YouTube sent me congratulating me on reaching 100,000 subscribers. Actually, it took 'em a while to get me the plaque, because I'm nearly hitting 200,000 subscribers now. And I just wanted to say that when I started the channel I had no idea it would ever be this popular, and I have you guys to thank for it, because you guys are what keep me motivated to keep making these. Recently I did an episode where I explored a 10-year-old Core Duo laptop to see if it was obsolete yet. Surprisingly, I found that it would still handle any modern task, and so I began to wonder:
Has Moore's Law finally come to an end, or has computer progress finally slowed down? After all, if you look at the last 40 years of computer history and run the exact same experiment, you'll see what I'm talking about. Try going to the year 1990 and looking at that computer versus a computer from 10 years earlier. They aren't even in the same league. So I started doing some research and taking a look at various aspects of computers.

I started with memory, looking at intervals of 5 years starting in 1975. At that point the typical computer had 1 or 2 kilobytes of RAM. Five years later that amount had increased by a factor of 16, and five years after that it increased again by a factor of 16. Surely this trend of exponential growth could not continue, right? Well, in order to go forward I'm going to have to shrink this chart down some. There, that will give us some more room. Over the next five years, memory doubled. That's still nowhere near the growth we saw before. Let's shrink the chart down again and look at the next five years. This time it quadrupled. Let's shrink the chart again, and wow, another 16-fold increase in RAM. Let's shrink again, and check this out, and this, and look where we are today. So I think we can safely say that as far as RAM goes, there's been no slowing down.

OK, so what about processing power? Let's just look at clock speeds starting in 1975. The typical speed was 1 MHz. Five years later there was no real change; in 1985 we made some progress, and that progress continued into 1990. We'll need to resize the chart again. You can see the CPU speed really started to take off by the year 2000; we were already 400 times faster than we were back in 1980. Now we'll have to resize again, and you can see continuing exponential progress in clock speed. Oddly enough, you'll notice it peaked, and by 2015 it looks like we went backwards. Well, there are multiple reasons for that. For one thing, clock speed is not really the best measurement for determining a CPU's total power. Another thing to consider is that these computers started off being 8-bit; by the mid '80s everyone was using 16-bit machines, 10 years later everyone was using 32-bit machines, and only recently did we start using 64-bit machines. So keep in mind that besides just the clock speed, these things can process a lot more data with each clock tick. And not only that, but most modern computers have more than one core, and a core is like its own CPU in and of itself. Most computers have anywhere from 2 to 4 cores, and some even as many as 8. So you can see this chart is not really reflective of CPU power (see the sketch below).

One way you might measure the difference in raw power would be with something like Geekbench. The original MacBook, released in 2006, gets a score of 2,287, where the latest MacBook gets a score of 6,350. So if I were to modify this chart to be more reflective of raw CPU power, it would probably look more like this. I also compared things like graphics, resolution, and hard drive capacity, and I found the same exponential growth that we've seen in the other aspects of computers, continuing up to the present day. I even compared the average cost of a home computer in this chart, and here's what it looks like adjusted for inflation. You can see the cost dropped quite a bit for a while, then kind of leveled out; it's even going back up a little bit. That's probably a result of most people switching from desktops to laptops.
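
To make the "clock speed isn't everything" point concrete, here is a toy C sketch. It is my own illustrative model, not anything from the video: it simply multiplies clock speed, word size, and core count into one crude "relative power" figure, ignoring IPC, caches, and everything else that matters in practice.

    #include <stdio.h>

    /* Toy model: relative power ~ clock (MHz) x word size (bits) x cores.
     * The machines and numbers below are rough illustrations only. */
    struct machine { const char *era; double mhz; int bits; int cores; };

    int main(void) {
        struct machine m[] = {
            { "1980 (8-bit, 1 MHz)",       1.0,  8, 1 },
            { "1995 (32-bit, 100 MHz)",  100.0, 32, 1 },
            { "2015 (64-bit, 4x3 GHz)", 3000.0, 64, 4 },
        };
        double base = m[0].mhz * m[0].bits * m[0].cores;
        for (int i = 0; i < 3; i++) {
            double rel = (m[i].mhz * m[i].bits * m[i].cores) / base;
            printf("%-26s ~%9.0fx the 1980 baseline\n", m[i].era, rel);
        }
        return 0;
    }

Even this crude model puts the 2015 machine at roughly 96,000 times the 1980 baseline while its clock is "only" 3,000 times higher, which is why the clock-speed chart alone understates progress.
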
So the question I'm trying to answer is: why is a 10-year-old computer today still usable, when not too far in the past a 10-year-old computer was always obsolete? If computer progress has not slowed, then what's the explanation? Well, I have one possible theory, and it actually has less to do with the computer's hardware and more to do with its software. It goes something like this. Now, this chart is not scientific, but hopefully it makes sense. If this bar represents the amount of CPU power a computer has, then this part represents how demanding the software is, using up almost all of the computer's power. Once a new computer comes out that's a little bit more powerful, the owners of this computer can enjoy a really speedy computing experience. That is, until the next year, when new software comes out that requires a faster computer to use. And then the next year the cycle repeats, and repeats again. And so this game of cat and mouse has been going on for decades. However, I think what may have finally happened is that now, with each successive year, the software is still getting bigger, but not at the same rate that computing power is increasing, thus leaving older computers still capable of running modern stuff. Maybe not as fast, but it still works. (There's a toy numeric version of this idea in the sketch at the end of the transcript.) Well, so that's MY theory. But I thought maybe I should ask a few other computer experts and see what they say.

One thing that sort of stood out to me when you were talking about all that was the graph where the processing power overall went down around 2010 or so. And that sort of lines up with an interesting time in the personal computer market, where a lot of things like netbooks were getting really popular, tablets were getting there (they were probably a couple years off from being really mainstream), and stuff like Chromebooks and lower-end laptops were all the rage for a time. And a lot of that, I think, coincides with usage changing around the same time as well, where a lot of people were using web apps and just stuff in-browser, as opposed to needing dedicated applications and things like that. And also, a different age range was starting to use computers way more often at that point; usage among older generations and such has effectively doubled since 2006, and a lot of that is basic social networking, photo sharing, even videos, which don't take a ton of processing power. A little more RAM, especially for browsers, but yeah, I think overall a lot of it has to do with a change in usage.

Thanks David, I agree with your theory, and I have a couple of additions to what you've already said. One is that with Windows Vista and Windows 8 failing so badly, Windows XP got a much longer life and a larger market share than it would have had otherwise. And also, there's been this big shift to using a web browser for basically everything; all your basic computer tasks can be done in a web browser nowadays. So, because Windows XP was so prominent, that kind of forced the web browser developers to continue supporting Windows XP, and you can download the latest JavaScript/HTML5-enabled Chrome or Firefox for Windows XP even today, and it works pretty well on an old computer.

Great question, David. I've got a pretty good theory on that. If you go back 10, 15 or more years, I think you'll start to see a shift in the ratio of power users to regular users.
Now, if you bought a computer in the early '90s, you were more than likely buying it for a specific purpose, to perform a specific function. While that's still true today, most people just want a window into the Internet. That's why ARM-based devices like iPads are popular. Your grandmother may not have had a computer 15 years ago, but she probably does today, and more than likely she uses it just to surf the Internet. So while computing resources might not be as much in demand as they used to be, memory resources still very much are. I'm sure you've seen your web browser in Task Manager consuming gigabytes of memory.

Hey guys, it's Mike from the Geek Pub. I'm going to answer this purely from the consumer standpoint, because in the enterprise we still struggle daily to get enough horsepower to run these large customer data applications, whether on something like Hadoop or Bigtable or those types of things. Also in the stock market, where we need to make split-second real-time decisions about whether or not to make a stock trade, we're still struggling with that. I think that will probably go on for about another decade. On the consumer side, I think David and Rob really covered it well, but I think there's a third piece, and that third piece is the cloud, which is a fancy way of saying "someone else's computer." It hasn't been that long since, if you wanted to do banking, you would install a very bloated app on your computer called Quicken, download all of your bank data, analyze it, print checks, and do all the other things you'd need to run your home finances. Today, all you do is go to www."your bank".com and use the website, and all the data processing happens on the bank's computers; your computer is just a window. So yeah, I think that's the third piece: much of what we used to do on our computers is now happening on someone else's computer.

Alright, so leave me some comments and tell me what you think the reason is. Also, for those who may not have seen it yet, I just wanted to point out that I have this other really cool channel called 8-Bit Keys. I do a lot of the same stuff as I do here, but it's mostly to do with music synthesis. On that channel I explore everything about electronic music from the 1980s, including FM synthesizers and computers and game consoles from that era. Also, I occasionally put together these little performances using nothing but vintage toy keyboards. ♫ [synthesized ballad] ♫ ♫ [synthesized version of "Canon in D" in the style used in the "Lemmings" DOS game] ♫ OK, so I put a link down in the description field; be sure to click that and check out that channel. And I'll see you next time. ♫ [synthesized outro music] ♫
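
As promised above, here is a toy numeric version of the cat-and-mouse theory from the transcript, in C. The starting headroom and growth rates are made-up assumptions chosen only to illustrate the shape of the argument (compile with -lm):

    #include <stdio.h>
    #include <math.h>

    /* Cat-and-mouse sketch: a new machine starts with some headroom over
     * current software demand; demand grows by a fixed rate each year; the
     * machine feels obsolete once demand exceeds its fixed capacity.
     * Both numbers below are hypothetical illustrations, not data. */
    static double years_usable(double headroom, double sw_growth) {
        return log(headroom) / log(sw_growth);
    }

    int main(void) {
        double headroom = 2.0;  /* assume a new machine is 2x current demand */
        printf("software demand +50%%/yr -> usable ~%4.1f years\n",
               years_usable(headroom, 1.50));
        printf("software demand  +7%%/yr -> usable ~%4.1f years\n",
               years_usable(headroom, 1.07));
        return 0;
    }

With the same starting headroom, slowing software growth from 50% to 7% per year stretches a machine's useful life from under 2 years to about 10, which matches the pattern the video describes.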

100 comments on “Are Computers Still Getting Faster?”

  1. Erick Alfaro Ceciliano says:

    My 8-year-old computer still works fine for me; maybe when it dies I'll need a faster computer, because everything I need works well today.

  2. georgysb says:

    The stagnation is more obvious from 2019. My 5-year-old desktop computer still handles everything I need. We really have already gathered all the apples from the bottom branches. Computer technology has finally reached a state of saturation. Prepare yourselves for upcoming centuries of 64-bit machines.

  3. Anturius says:

    In the past, programming was an art; today it is a compilation of libraries.

  4. kjell159 says:

    I think smartphones and tablets taking over a big portion of the computing market share, with a lot of development going into that, also plays a part in it.

  5. connect2reality says:

    No mention of the physical limitations of creating faster and faster computers? You can only slice up silicon so thinly before quantum effects start getting in the way.

  6. UndefinedEssence says:

    I wish I'd seen this when it was posted a couple years ago. I noticed nobody in the videos mentioned much about the gaming market. I agree that our biggest hurdles are the physical hurdles of greater clock speeds; I think I only saw this mentioned in the comments. There is always a group (gamers) that would love to see greater processing power, despite whatever else is going on. So a limit on processing power based on consumer demand seems unlikely.

  7. Hasanchik Gaming says:

    The subs right now: 945K.

  8. Neb6 says:

    In the 90s, this question would be unheard of. Nowadays though, one wonders how much we're really progressing.

    The software putting increased pressure on CPUs with each new version was very apparent in the 90s and early 2000s. That and people multi-task like crazy nowadays. Then there are those web browser plugins.

  9. VicVanProphet says:

    I know you will find it weird, but what makes your channel so successful is your voice, and how pleasant you are; it's kinda relaxing to watch. Plus, of course, I'm from the '80s, so it brings back good ole memories, and we also learn to fix stuff, thanks to your wonderful, well-edited videos.

  10. The Geek on Skates says:

    Okay, I know this is a very old post, but I wanted to comment on the software side of this. I don't know enough about the hardware side of things like clock speeds, cores and all that – I understand the concepts, but as a coder I see it from a totally different perspective: most programmers are not all that concerned with memory management anymore. That's what the garbage collector or interpreter is for. I build web apps at work, and memory use is not even a thought until an app needs to scale. But at home, I've been learning to code for retro systems like the VIC-20, C64 and others, using a C compiler. And when you're building for systems where every byte counts, you find you do things a lot differently. You can store 8 boolean variables in a single byte, so instead of checking if some int == 1, I'm checking if some unsigned char & SOME_CONSTANT (lol; see the sketch below). But you can't do stuff like that in high-level languages, so as the one guy said, we now have web browsers consuming gigs of RAM; the JavaScript interpreter works overtime, casting variables to different formats, converting arbitrary data objects into a format it can use, and the list goes on. I don't think that's the ONLY reason, but I do think it's a reason. If we paid a little more attention to how we use our resources, slow computers would not be a thing. But that's just one geek's opinion, I guess. 😀

    Anyway, thanks for the great video. Even though it was from ages ago I think this discussion will continue to be relevant for years to come.
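
    A minimal C sketch of the bit-packing trick this comment describes; the flag names are made up for illustration:

        #include <stdio.h>

        /* Eight boolean flags can share one byte; each flag is one bit. */
        #define FLAG_SOUND_ON  (1u << 0)
        #define FLAG_JOYSTICK  (1u << 1)
        #define FLAG_FAST_MODE (1u << 2)

        int main(void) {
            unsigned char flags = 0;
            flags |= FLAG_SOUND_ON | FLAG_FAST_MODE;   /* set two flags  */
            flags &= (unsigned char)~FLAG_SOUND_ON;    /* clear one flag */
            if (flags & FLAG_FAST_MODE)                /* test a flag    */
                printf("fast mode on (flags = 0x%02X)\n", flags);
            return 0;
        }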

  11. Mike G says:

    My 1999 IBM has 500 MHz.

  12. Χρήστος Κώτσαρης says:

    It's just that we've reached "good enough" performance for most multimedia applications the mainstream is ever going to use. Video apps, music apps, office apps, even games (especially indies) do not need that much horsepower anymore. In the consumer space, outside of AAA gaming and possibly VR, there is no reason to have a very expensive computer anymore. Even things like video editing can be done fine on budget systems; they might take a little longer, but unless you are a professional needing to do many video edits per day, you shouldn't care if your grandma's video takes a little longer to edit…

  13. jackno7daniels says:

    Sweet shirt!

  14. Muh Dewd says:

    I always thought that using clock speed to compare processors was a mistake, as some processors can do more per clock than others.

  15. George Luyckx says:

    With neural networks and VR we are still hitting the ceiling. Robots and drones will need a lot of computing power to analyze real time images.

  16. Jason Gooden says:

    Watching in 2019, the answer is yes

  17. Edmilson Alves says:

    Looove your videos.

  18. Gary Craft says:

    almost a million!!!!!!

  19. N English says:

    A 10-year-old Mac, maybe; not an IBM PC.

  20. The Dark Knight says:

    Computers reached their peak around 2005.

    They can't improve the speed anymore because of bottlenecks in the FSB.

    In the 1990s the CPU and FSB clock speeds were synchronous, and computers were fast even though clock speeds were lower than today's.

    Today's CPUs just pile up instructions and clog the pipes in the computer.

  21. Joakim Siljelind says:

    Or they use marketing to claim computers have become better than they actually have. Benchmarks can be tricked, etc.

  22. Optidorf says:

    Next step: quantum computing 😀

  23. iggytse says:

    Cloud computing shifted the heavy processing from the client PC to the server in the cloud. The only people who need serious PC power are gamers, video editors, etc.

  24. Raden Mulyadi says:

    I like the theory of using other people's computers.

  25. Tarik Al-Hoshan says:

    Throw an SSD in a 10-year-old PC, add some RAM, and you are good to go.

  26. Muranaman says:

    RAM will slow down now. 8GB seems pretty much set for the next 5 years.

  27. Donut Religion says:

    And now over 11 thousand subscriptions.

  28. biggs hasty says:

    I know this is an old video, but it's still kinda relevant. I get asked this question all the time. Thanks for uploading.

  29. gavendb says:

    Thanks for watching; keep your C64 in a vise.

  30. Danfuerth Gillis says:

    The consumer is really the last sheep to eat new products. There were 300x microscopes in 1850, there were electron microscopes in the 1930s, there were solid-state drives in 1970, and there was LED technology decades before it reached consumers. The original Pentium supported 64 gigs of RAM with PAE; 64-bit is a con, and there was never a need to switch all hardware to 64-bit drivers, since a lot of hardware is still 32-bit internally and all CPUs have supported 64+ gigs of RAM since the Pentium 1. Nvidia is the reason everyone was forced to 64-bit. Consumers always get lame hardware and computing power compared to what is being used industrially.

  31. Andres Berger says:

    We've reached a point where software is getting very costly to produce, because it is extremely large and complicated. Another reason is that companies no longer know how to slow down our computers… 🙂

  32. Lewis hughes says:

    Who needs to get their ears pierced when this guy’s intro will do it for you

  33. Ryan K says:

    I just got one of those Samsung S9 phones and, stock and stripped to bare essentials, I have less than a GB of RAM free… sux.

  34. Cristaliana Ivor says:

    Well, my laptop that I use regularly is ten years old; it has 2 cores at 2 GHz (which is only half of what my gaming PC can do) and 4GB of RAM. It works well for many tasks like programming and watching YouTube… some things lag because, e.g., Facebook is badly optimized. Also, games hardly work, because it does not have a real graphics card, only an integrated chip.
    I only bought myself a new laptop because I like drawing and the new one has stylus support. Okay, and because the old one is so heavy that you could probably kill someone with it.

  35. Bryan Liguori says:

    So my 2010 Dell laptop that still runs like it's 2012 is not an anomaly lol
    I just wish it could run Sim City 4 at full graphics (which came out back in 2003!)

  36. Orlando Moreno says:

    Browser tabs are consuming more memory because of an "obesity" crisis in web pages. Lots of layered client-side code, trackers, ads, side content and apps… They really have no consideration for multi tab workflow…

  37. Jacob Downey says:

    @5:35 I like your graph & animation skills! Thank you for taking the time!!

  38. Eric Richards says:

    Try playing modern games on your ten year old computer then you’ll see the problem

  39. Da's pas flauw says:

    Also, the change in hardware architecture and configuration meant having to switch to a new OS with each generation; now, not so much. Also, from a business point of view, platforms with long-term establishment have been shown to pay off more; see the Google Play Store.

  40. BitVolt says:

    Dude, you are awesome!

  41. someone says:

    5:34 laughs in Threadripper

  42. DarthScape says:

    "As many as 8"

    And we now have 64 cores in a single socket.

  43. Road Blocked says:

    Great video Jared thx

  44. Forest Gump says:

    Most people just browse the net and watch videos, which doesn't require a supercomputer. You can also still upgrade a 10-year-old PC quite easily with an SSD, more RAM, and the latest video card, so the CPU isn't so important…

  45. chrism3784 says:

    Watching this now for the first time, laughing about the 100K subscriber plaque; they're up to 957K as of now.

  46. chrism3784 says:

    I think a small part is, for one, that we peaked out on resolution; our eyes cannot see past 4K resolution. In fact, HD is about as good as most of our eyes can even see. I have really good vision and I can barely tell the difference between HD and 4K. I can, just barely, so 4K is good, but 8K? Humans, even with the best vision possible, will not be able to tell any difference. Let's work on getting 4K to 120Hz refresh before worrying about resolutions our eyes cannot even tell apart.

  47. Un Perrier says:

    Well, the main reason is that in the '80s the CPU dominated overall speed, and circa 2000 that reversed: now CPUs are faster than memory. That renders moot the CPU and memory graphs spanning 4 decades.
    Said differently, in the '80s memory wasn't dominating overall performance: CPU speed was the master. These days memory is the bottleneck: CPUs usually wait a number of cycles before the memory returns data. And the advance in memory speed isn't that huge: at best it has doubled in 10 years. That's why a 10-year-old processor can perform today's tasks: the workload is limited by memory speed, and memory speed hasn't improved that much in that timeframe. At least not as much as in the '80s, when memory speed didn't matter; it was CPU speed that mattered, and CPU speed was growing exponentially.
    I hope all that makes sense. If not, ask about the dominance of logic gate speed vs. dynamic RAM speed on microelectronics forums (not electronics but microelectronics, and preferably ask physical designers).
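
    A rough C demonstration of the memory-bottleneck point above: both loops perform about the same number of additions, but the second must fetch operands from a large array with a cache-hostile stride, so it spends most of its time waiting on memory. The sizes and strides are arbitrary; compile without aggressive optimization, and expect results to vary by machine.

        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        #define N (1L << 24)   /* 16M ints, well past typical cache sizes */

        int main(void) {
            int *a = malloc(N * sizeof *a);
            if (!a) return 1;
            for (long i = 0; i < N; i++) a[i] = (int)i;

            /* Compute-bound: operands live in registers. */
            clock_t t0 = clock();
            long sum = 0;
            for (long i = 0; i < N; i++) sum += i * 3;
            double compute = (double)(clock() - t0) / CLOCKS_PER_SEC;

            /* Memory-bound: large-stride reads defeat caches and prefetching. */
            t0 = clock();
            long sum2 = 0;
            for (long s = 0; s < 4099; s++)
                for (long i = s; i < N; i += 4099) sum2 += a[i];
            double memory = (double)(clock() - t0) / CLOCKS_PER_SEC;

            printf("compute-bound %.3fs, memory-bound %.3fs (%ld %ld)\n",
                   compute, memory, sum, sum2);
            free(a);
            return 0;
        }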

  48. Joseph Keenan says:

    Yep on the demands of your own computer being shifted to a server via the web….

    It's good for general usage and the general user who plays casual games and uses social media.

    When you get into people who use their computers for more robust tasks (photo/video/graphics rendering, audio work, and to some extent the people who play the fluff stuff (FPS)), more power is still needed.

    But as you also brought up, there is a marked difference between rendering a video on a dual-core processor with 8G of RAM and doing it on a same-speed quad core with 16G of RAM and even a 1G GPU.

  49. Name Surname says:

    Now you're getting 1 million subs.

  50. Cranky Fox says:

    To answer the question: yes. Moore's law only referred to transistor counts on a single chip; this whole debate over "speed" is largely irrelevant to Moore's law. There are many reasons for this, but the main reasons why it feels like we aren't getting faster computers boil down to three problems (see the sketch after this comment for point 2):
    1: Most people aren't taught to code effectively.
    2: Most companies and coders never take advantage of new instruction sets and parallelism, because they don't know how to break a serial task into parallel subtasks.
    3: Bad design and greedy decisions destroy optimization.
    Windows XP ran on 256MB of RAM.
    Windows 10 requires 8GB minimum for effective running. Phones and computers ship with bloatware and other garbage that slows down the computer.
    In summary, computers are astronomically faster at real computing than the old computers ever were, at a fraction of the price and size they used to be.
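
    As a sketch of point 2 above, here is how a serial sum can be broken into parallel subtasks with POSIX threads. The thread count and range are arbitrary illustrations (compile with -pthread):

        #include <pthread.h>
        #include <stdio.h>

        #define NTHREADS 4
        #define N 100000000L

        /* Each thread sums an independent slice; results are combined at the end. */
        struct chunk { long lo, hi, sum; };

        static void *partial_sum(void *arg) {
            struct chunk *c = arg;
            c->sum = 0;
            for (long i = c->lo; i < c->hi; i++) c->sum += i;
            return NULL;
        }

        int main(void) {
            pthread_t tid[NTHREADS];
            struct chunk ch[NTHREADS];
            long step = N / NTHREADS, total = 0;

            for (int t = 0; t < NTHREADS; t++) {
                ch[t].lo = t * step;
                ch[t].hi = (t == NTHREADS - 1) ? N : (t + 1) * step;
                pthread_create(&tid[t], NULL, partial_sum, &ch[t]);
            }
            for (int t = 0; t < NTHREADS; t++) {
                pthread_join(tid[t], NULL);
                total += ch[t].sum;   /* combine independent subtotals */
            }
            printf("sum of 0..%ld = %ld\n", N - 1, total);
            return 0;
        }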

  51. 2wheelphoto says:

    The main reason is that companies are able to get away with charging consumers more for less computer, because of branding, and because most people who have computers for personal use don't know much about them.

  52. 2wheelphoto says:

    You should ask Louis Rossmann

  53. FuzzyFoyz says:

    Would love to see an update of this, given the shift to cloud at present and a possible future shift to blockchain.

    I would say that the shift from desktop apps to SaaS is where hardware speed requirements are becoming a thing of the past. At least as far as hardware for Joe Public is concerned, anyway.

  54. 51lunt says:

    Computers used to be for people who create content; now they're for people who consume it. I am oversimplifying. Another subject linked to this one is the lack of diversity in processor architectures: the PS3 processor was so good that it served the American army for parallel calculation with Linux. But it seems coding was very difficult on that kind of architecture.

  55. Chincer Dante says:

    Since I'm watching this from a computer that's over 10 years old, I feel compelled to comment. I have worked a bit with computers in my country, and the longevity of Windows XP is just surprising. Also, older computers are cheaper and more convenient for single-purpose equipment (like selling lottery tickets, cash machine software, or administrative software). Old computers seem to hold up surprisingly well, since you can always just run old software, sacrificing some improvements and features.

  56. Lhakryma says:

    Right now, in 2019, around 16GB is the "norm".

  57. CrazyArtEducator says:

    My theory is that once we got 24-bit multichannel sound and 24-bit hi-res graphics, the demand for faster and more powerful computers slowed down. Cool songs! In return, listen to "Nebawanganza"! And thanks for the 8-bit stuff. Elementary and fundamental things which still apply!

  58. Electric Monk says:

    clock speed + spyware.

  59. John Dunlap says:

    Another commonality is that computer architecture is much more stable than it used to be. Speeds and capacities increase but, at a functional level, they aren't doing anything differently than older computers.

  60. Trevor Welsby says:

    The answer is pretty simple. Back in the day software and especially OS developers could count on everyone having a desktop computer and so the range of available processing power was confined to a narrow range. With more and more mobile hardware the gap between the top and bottom ends has gradually increased. Now software, and particularly Windows, is expected to run well on anything from a smartphone to a powerful desktop computer, so an old desktop or laptop can easily keep up with the current bottom end. The low end will continue to be relatively static due to power constraints (because power consumption is closely coupled to processing power and battery technology has not followed Moore's Law).

  61. Oriru Bastard says:

    Faster? Ha ha! No…
    But they'll get better at multitasking and moving massive amount of data around at once.

  62. Jimmy Andersson says:

    I told my dad in the '90s, when the 166MHz computer was released, that computers wouldn't get any faster than this, and I was right. He misunderstood me; well, the computer technology will get faster, but the software will also demand more power. And I built my own desktop in 2008 with the best stuff available, and it's still faster on the internet than a new laptop. And it runs games great. No reason yet to get a new computer.
    PS: I love this channel, I'm getting a lot of memories back..😊👍

  63. felipe bustamante says:

    Now I have a Mac Pro 1,1 which boots Win 10 without Boot Camp and works well, and I'm thinking of taking out the board and putting in a new one. I tried Apple; the software is very interesting, but it's so expensive that it makes me rethink going back to Windows.

  64. Math Teacher says:

    Windows 98 runs much smoother on my old PC than Win 10 on my new PC.
    One reason is the ratio of software power demand to hardware power.
    Another reason is that not all software can use multi-core abilities.

  65. Mano Erina says:

    Awesome

  66. Ren Woxing says:

    Damn near a million now bud.

  67. PERIODIC TABLE says:

    Clock speeds in CPUs did not decrease in 2015; they increased to around 3.5 GHz. And now, in 2019, they're at 5 GHz.

  68. PERIODIC TABLE says:

    Also, clock speed does still matter, but it isn't the only thing that matters.

  69. Realmantik says:

    I want to see ternary logic in computers.

  70. Peter Ruiz says:

    The answer is no

  71. john locke says:

    give this motherfucker 1 million subscribers 🤙

  72. Michael Coats says:

    Core speeds aren't going up. Instead we are going sideways with multi cores. So yes, Moore's Law has died.

  73. darkmage35 says:

    Technological progress is never actually exponential. It follows a logistic curve instead. Easy mistake to make, and often people only notice once they hit the inflexion point. Some of the measures of computing power have hit that inflexion point already, such as CPU speed, but transistor density (the thing that Moore's Law really refers to) hasn't quite got there yet. It's a bit unclear. Probably will soon though.

    See:
    https://en.wikipedia.org/wiki/Logistic_function
    https://qph.fs.quoracdn.net/main-qimg-033cfb630cd8611ea50ca163f2220666
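
    A small C illustration of the logistic-vs-exponential point above (parameters are arbitrary; compile with -lm): early on the two curves are nearly identical, and the ceiling only becomes visible near the inflection point.

        #include <stdio.h>
        #include <math.h>

        int main(void) {
            /* Logistic f(t) = L / (1 + exp(-k (t - t0))) vs. the pure
             * exponential L * exp(k (t - t0)) that it resembles early on. */
            double L = 100.0, k = 0.5, t0 = 20.0;
            for (int t = 0; t <= 40; t += 5) {
                double logistic = L / (1.0 + exp(-k * (t - t0)));
                double expo     = L * exp(k * (t - t0));
                printf("t=%2d  logistic=%8.3f  exponential=%12.3f\n",
                       t, logistic, expo);
            }
            return 0;
        }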

  74. Rölli Peikko says:

    I'm watching this on a Raspberry Pi 4. I could have done it with the $35 model (but this is the $55 model). It is a new computer, but spec-wise it does not really kick ass in terms of performance.

  75. doodoostickstain says:

    I see a lot of people mentioning computers in the '90s becoming "crap in a few years." That's not really true; the problem is the software would self-destruct. Think of it this way: the hardware always retains its speed, so why is it suddenly slow? Reinstalling Windows would bring that "crap" PC back to life, as would simple tweaks to configurations as we progressed, for example, hardware-based compression in modems. Bad for Quake, good for transmission of non-critical data. (Go away, Linux enthusiasts. We struggled to do the most basic modern things back then, and to this day. Nowadays I'm just tmux and vim through SSH unless I'm entertaining myself, like right now 😀)

  76. Tomasz K says:

    I think a lot of that may be the graphical part of our interfaces. Back when it was 240×160, we could constantly crave more. Currently it's more about functional and catchy design than about increases in resolution and complexity. It's been really OK since the pixels stopped bothering us. In effect, the typical Windows layout has been just fine for some time now, along with the computing power handling it. The casual usage, web browsing and such, is the cherry on top. Now, high-end gaming and Retina screens are a different story, but they're not the core of the market either.

  77. Diego Martínez says:

    Let me try a somewhat physics-based reason for that: the speed of light. Years ago, 300,000 km per second was very, very far beyond the actual reach of processors. Today, when we are talking about 4 billion operations per second (4 GHz), each operation happens in the time it takes light to move about 7.5 centimeters. We actually know that each operation involves a group of bits going from one side of a silicon die to the other, moving something like half a centimeter, and electrical signals travel slower than light in a vacuum. Other than adding processors or "cores", we have hit a pretty unbreakable barrier. Computers cannot go much further in single-core processing power, and then there is the same situation as photography: once you reach a certain number of megapixels, the human eye just cannot see any difference. Similarly, most people just don't need any more processing power, so, with the exception of very specialized tasks or very demanding games, there is no reason for people to actually feel a need to change their current computers, so companies will keep making software for those "old" computers that are still very much in use.
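
    The arithmetic behind the comment above, as a tiny C program: the distance light covers per clock cycle is just d = c / f (at 4 GHz that is about 7.5 cm, and on-chip signals move slower still):

        #include <stdio.h>

        int main(void) {
            const double c = 299792458.0;             /* speed of light, m/s */
            double freqs[] = { 1e6, 1e9, 4e9, 5e9 };  /* 1 MHz .. 5 GHz      */
            for (int i = 0; i < 4; i++)
                printf("%8.0f MHz -> light travels %10.2f cm per cycle\n",
                       freqs[i] / 1e6, 100.0 * c / freqs[i]);
            return 0;
        }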

  78. AceJack73 says:

    One thing that was not mentioned is peripherals, like CD/DVD-ROM drives and connections for TVs. My old Mac had all of that stuff, but my new one does not.

  79. mrbing 70 says:

    Freakin' cars are rolling computer powerhouses now, something that was much more limited as of the making of this video.

  80. admin _yuki says:

    By the way, you need to add one more zero on that plaque; it's near 1,000K.

  81. Atoool K says:

    Congratulations!

  82. Rick Jasper says:

    Your expert speculators didn't cover a very important detail in CPU development:
    SPEED = HEAT
    I believe CPU makers have, to some degree, shifted away from higher clock speeds in favor of adding multiple cores for greater parallel processing power. The benefit is faster CPUs that don't exponentially increase heat output. Yes, wattage increases, but not at the level it would if you, say, had a 20 GHz CPU, and multiple cores can be spread out over a larger substrate to dissipate heat more efficiently.
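
    A back-of-the-envelope C sketch of the SPEED = HEAT trade-off above. It uses the standard CMOS dynamic-power relation, P proportional to V^2 * f; the voltages and frequencies below are made-up illustrations, not real chip specs:

        #include <stdio.h>

        int main(void) {
            /* Dynamic power scales with V^2 * f; higher clocks usually also
             * need higher voltage, so power rises faster than frequency.   */
            struct { const char *name; double volts, ghz_total; } cfg[] = {
                { "1 core  @ 3 GHz, 1.0 V", 1.0,  3.0 },
                { "1 core  @ 5 GHz, 1.3 V", 1.3,  5.0 },
                { "4 cores @ 3 GHz, 1.0 V", 1.0, 12.0 },  /* 4x switching */
            };
            for (int i = 0; i < 3; i++) {
                double p = cfg[i].volts * cfg[i].volts * cfg[i].ghz_total;
                printf("%-26s -> relative power %5.2f\n", cfg[i].name, p);
            }
            return 0;
        }

    Under this toy model the 5 GHz core burns about 2.8x the power of the 3 GHz core for 1.67x the speed, while four 3 GHz cores deliver 4x the throughput at the same power per unit of work, which is the trade the comment describes.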

  83. Celeste Gauthier says:

    So, based on the statements… not as much. But based on the compute power… yes. But Moore's Law is dead as he stated it, since more power no longer comes from a smaller semiconductor manufacturing process but now comes from cramming more cores (effectively equivalent to CPUs) onto a single die. IOW: each core is no longer doubling in compute power over previous-generation cores.

  84. COMBINE says:

    The answer is that there are fewer and fewer new features in most of the applicable software, so the old version is still usable and the new one has the same resource demands. Except, maybe, multimedia software and web browsers.

  85. Abdullah Hamad says:

    I would be interested in learning what your sample set was for this data… was it from advertisements or sales reports? Did you hold the cost constant, or the model line? Things like overclocking at the point of sale really inflated clock speeds in 2010. My own computer is 10 years old and still kicking (I've done a lot of repairs and workarounds on it in that time). A big reason the industry hasn't advanced the tech is that everyone is using lightweight social media, peer-to-peer process sharing, and cloud computing that offloads work to supercomputers, with people settling for netbook access portals as a result; meanwhile the economic recession of 2009-2014, related to something the news called a "Housing Bubble," made the tech industry squeeze the market shut until the public could afford the highest-end parts (profiting by taking advantage of traditional price tiers to increase revenue 10X by selling worse for more [as a result, the NEXT generation will be super OP]).

  86. nucflashevent says:

    As many others have already said, an SSD is always the first upgrade in any older machine you want to pep up. Second would be maxing out the RAM. However, sadly, with old MacBooks you're going to find much, much happier computing with Windows 10 versus whatever version of Mac OS X they will run.

  87. Lone Wolf says:

    Well, you could say that up until Windows 95, and even later, so much software was machine-specific… whereas during the 2000s there was a great big push to make everything work on everything, have drivers everywhere, and so on and so forth.

  88. Jared Connell says:

    You can't measure a CPU's power in megahertz; that's just how many clock cycles it runs per second. You need to measure in something like FLOPS in order to compare a modern CPU with multiple 64-bit cores to an older 32-bit single-core processor that has a much higher clock speed.

  89. Mark Thomson says:

    If you looked at the computing power of non-market computers, such as military ones, the CPU speed would be exponentially bigger… as the consumer market, we get already-dated tech.

  90. Mahmoud Qurashy says:

    Because of the dominance of portable devices, the main focus of CPU manufacturers has shifted toward reducing power consumption.
    Now we have devices with full-day battery life and laptops with passive cooling.

  91. SuperCookieGaming says:

    I think if you looked at desktops vs. laptops separately, the graphs would look better.

  92. Nicholas Perrin says:

    I think you guys nailed most of the reasons for this, but missed what I feel is one of the most salient points. Yes, XP was supported for quite a while. Vista, in my opinion, failed so very badly due to how processor-heavy it was as an OS. Windows 7 and 8 did try to fix this, but still hit the processor pretty hard. I am not sure about your experiences with it, or those of the people you know, but for me, most of my troubleshooting and need to upgrade parts for customers was simply due to that draw on the processor. The processors then just could not cut it. Hell, I think even now I would hate to run anything past XP, up to 10, on my own current PC. With Windows 10, when the free upgrade came out, I was really leery. I was on 7 at the time, pissed about that and missing my XP days, feeling burnt by 98 and tortured into Vista; and 7, while a little better, was still not what I wanted. I did end up upgrading, and was amazed. Then I started upgrading my family members' older machines to 10, to see if I would experience the same results. I am not a fan of some of Windows 10's pushier features, but damn does it run smooth, even on an older machine, and in most cases much better than the original OS that I built the machine around. It just uses the processor in a much more efficient manner than previous versions of Windows. Yes, a lot of that has to do with shaping it around the current usage trend of just being a window, but even just comparing load times, without anything else as a factor, really blew my mind.

  93. TJLRekkid says:

    Old topic, but a fun one. The only reason we ever wanted a faster computer was so we could run our software faster. If the software didn't change, we didn't need new hardware. I remember in the '80s, '90s and 2000s installing the latest and greatest software and then discovering I needed more hardware to run it. That's less of a thing these days. Gamers want faster machines so they can run the latest titles, experience better graphics and faster response times, run VR… etc. Video editors want to be able to process higher-resolution video. For the average user, though, our spreadsheets run fine, and so do our word processors; Facebook is in the cloud, and so are our email applications. The people that want better hardware have less and less purchasing power. Cloud-based applications backed by data centers are really where the action is these days, processing terabytes of data for queries, and they scale the resources available to you at enormous expense. The average home user never experiences that and has no reason to upgrade their computer. I recently replaced my four-year-old desktop with a new one. Yes, I can run VR now, but that is the only thing I now benefit from; I could have just upgraded the video card. Nobody wants bloated software, and there is no reason for it. It's very reminiscent of the mainframe/dumb-terminal days, and it's been happening for quite some time, crypto-currency mining aside. I can't see a reason in the near to mid term why we would need to upgrade our hardware, at the consumer level, for desktops and laptops. Phones and tablets have a bit more legroom as they expand their capabilities to match the aforementioned. We'll need to discover new ways to tax hardware locally if we ever want to see a shift.

  94. mYOUsic says:

    IMO, the whole computer market slowed a bit. If you took a PC from 2000 and compared it to a PC from 2010, the newer one would be 10000000x faster. If you take a PC from 2010 and compare it to a PC from 2019, it would be only a few times faster. My parents are using a PC with a Q6600 @ 3.6GHz, 8GB of RAM, and a 256GB SSD. It is more than enough to browse the internet or watch YouTube or Netflix. It is also more than enough to run MS Office or the other stuff they use for work. You couldn't say that about a PC from 2000 in 2010. There was a HUGE jump in performance back then, but for the past few years Intel has added only a few percent of performance to each Core series.

  95. Nathan Tankersley says:

    No, the innovation has shifted to mobile; look at phone progress.

  96. TynMahn says:

    The reason is consoles. Gaming consoles got to the point where they could almost rival PC games, and all kinds of new users flocked to consoles while PC gaming slowed down. While nowhere near dead, it isn't at all what it was, nor what it would have been had all those new console players not gone console.

    The gamers were a huge "power user" block that isn't as influential anymore.

  97. Devil Bacon Hair says:

    Commodore 64: I'm ready 24/7/365 and start in less than 1 second.
    Windows: Would you like to update? Yes or yes.

  98. Some Canine says:

    I think the popularity of tablets and lapbooks that people were using for basic needs, not hardcore gaming, was the main reason older systems can still keep up. They were spending all their time and energy getting phones and devices onto the internet and loaded with apps, but those didn't use very much processing power.

  99. Peter Knutsen says:

    983k as of now. Close to the next plaque?

  100. Thomas Jefferson says:

    OMG LGR appearance!
