I can't emphasize enough how difficult it is to write programs on the Atari 2600, also called the Atari VCS. Since the machine had only 128 bytes of RAM, there is no video memory at all. Instead, as the raster draws the picture on the TV screen, the microprocessor has to constantly tell the display which pixels to draw. It is a miracle that it can display anything at all. The code necessary to draw the screen is contained in the ROM cartridge. Most of the microprocessor's time is spent drawing the screen, so any game logic has to run during the television's vertical blank period, the time during which the electron gun moves from the bottom of the screen back to the top to start the next frame. The vertical blank lasts about 1,330 microseconds and happens sixty times per second.
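To get a feel for how tight that vertical-blank budget is, here is a back-of-the-envelope calculation. The clock speed is my assumption, not from the text above: the 2600's 6507 processor ran at roughly 1.19 MHz on NTSC machines.

```python
# How many CPU cycles of game logic fit into one vertical blank?
# Assumption (not from the text): the NTSC 2600's 6507 runs at ~1.19 MHz.

CPU_HZ = 1_190_000        # ~1.19 MHz NTSC clock (assumed)
VBLANK_SECONDS = 1330e-6  # ~1,330 microseconds of vertical blank per frame

cycles_per_vblank = CPU_HZ * VBLANK_SECONDS
print(f"~{cycles_per_vblank:.0f} CPU cycles of game logic per frame")
# Roughly 1,600 cycles, sixty times per second, for everything the game does.
```

That is on the order of a few hundred instructions per frame for all of the game logic, which is why 2600 programming is so notoriously hard.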
There were a few rare 2600 cartridges that carried extra chips to boost the memory or the capabilities of the machine. These special cartridges only became practical once the extra chips got cheap, in the late 1980s, near the end of the 2600's life.
Some early primitive computers with limited memory, like the Sinclair ZX80, ZX81, and Timex-Sinclair 1000, also used the microprocessor to draw the display. This didn't involve computer code like on the 2600, but a hardware trick that got the microprocessor to copy bytes from memory to the display. It is my understanding that the first Macintosh computer lost about 40% of its processor time driving its display.
Memory limitations drove the graphics on all videogame systems and computers throughout the 1980s. Instead of every pixel having its own unique memory location, as has been the norm since the mid-1990s, the screen was made up of tiles, or blocks, much like the characters on a computer keyboard. Each tile could be defined however you wanted, usually with a limited number of colors. When I was programming on the Super Nintendo, the artists would create the tiles, and the program would tell the tiles where to display on the screen. Objects that move on the screen are called "sprites"; the hardware displays these in front of the background tiles, and they are made up of their own separate tiles. By the mid-1990s these display methods were no longer necessary because the chips were faster and the systems had more memory.
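The memory savings of this scheme are easy to see in a toy example. This is only an illustrative sketch, not any real console's hardware or API; I use tiny 2x2 one-bit tiles for brevity, where real hardware typically used 8x8 tiles with a few colors each.

```python
# A minimal sketch of tile-based rendering (illustrative, not real hardware).
# The tilemap stores one small tile reference per cell instead of every
# pixel, which is what saved memory on 1980s systems.

TILES = {
    "blank": [[0, 0],
              [0, 0]],
    "brick": [[1, 1],
              [1, 0]],
}

TILEMAP = [["blank", "brick"],
           ["brick", "blank"]]

def render(tilemap, tiles, tile_size=2):
    """Expand a grid of tile names into a full pixel grid."""
    rows = len(tilemap) * tile_size
    cols = len(tilemap[0]) * tile_size
    screen = [[0] * cols for _ in range(rows)]
    for ty, row in enumerate(tilemap):
        for tx, name in enumerate(row):
            tile = tiles[name]
            for y in range(tile_size):
                for x in range(tile_size):
                    screen[ty * tile_size + y][tx * tile_size + x] = tile[y][x]
    return screen

screen = render(TILEMAP, TILES)
for line in screen:
    print("".join(".#"[p] for p in line))
```

The key point is that each tilemap cell is one small number, while the tile it points to can be reused all over the screen, so a full display costs far less memory than storing every pixel.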
The next generation of AMD CPUs coming this year has a big boost in CPU power, but not a big boost in integrated graphics. I've been wanting a powerful APU, which has a CPU and powerful graphics on the same chip, saving the cost of a separate graphics card, like the custom chips in the Xbox Series X and the Sony PlayStation 5.
The current-generation AMD Ryzen 9 5950X is a beast of a processor and can play games, but it lacks the graphics capability of the videogame systems.
However, the next generation of AMD APUs is not coming out until next year, or maybe the fourth quarter of this year as laptop processors. If I want a powerful CPU and reasonably powerful graphics, then either I would have to buy a new CPU and a graphics card, or settle for an upcoming laptop processor. I think that 2023 should bring me some good options, although I was hoping to upgrade this year.
My 2017 iMac can play games better than I expected. It has a low-end graphics card like what would be found in a laptop. However, the CPU power is unimpressive. I have the option of upgrading the processor to an i7-7700K, at a cost of $350 to $400, but I would still be a few years out of date. The better option is to wait for the next generation.
So I tried to activate Siri on my iPhone and I got the message, "iPhone unavailable. Please try again in 5 minutes." At first, I was puzzled, but then I remembered that this might be a passcode issue. I figured that the screen was accidentally on, and the phone was in my pocket with my eyeglass case, and this somehow resulted in some incorrect button presses. Apparently, Apple temporarily locks you out if you incorrectly enter the passcode too many times. It is a security feature. This would be bad if there were an emergency, although during the lockout it gives you the option of making an emergency call.
The bottom line is that I want a more powerful computer. I can get by with what I have, but my 2017 iMac is only about twice as fast as my rapidly dying late 2009 iMac. Considering the difference in years, I expected more progress. I assumed that this would be enough, but it is a bit underwhelming. Compared to most modern computers, it is way below average. I have talked to a local repair shop about upgrading the processor to an i7-7700K, which would cost at least $400 with labor, but it would only boost my speed by about 60%. That might be enough, but if I am getting into that kind of money then I might be better off buying another computer.
For this reason, I get excited when I see big progress being made in computer processors. The last decade saw only incremental improvement, but what Apple has done with its recent M1 chips is just extraordinary. The M1 chip is about 2.5 times faster than my 2017 iMac and uses far less power.
However, I'm not rushing out to buy a new Apple computer. I also need Intel-based Windows compatibility. My chess programs and other games need this platform. It is possible to install the ARM-based version of Windows on an M1 Macintosh, which does come with some Intel emulation, but trying to run Intel-based games on this setup has been described as not worth the trouble. There are compatibility and performance issues.
Instead, I am waiting for the other manufacturers to catch up to Apple.
In the second half of this year, AMD is going to release their 5-nanometer 7000 series of processors, reportedly all of which will come with some graphics capabilities built into the chips. These won't be as good as an expensive GPU costing a thousand dollars, but the 7000 series of processors would allow someone to build or buy a powerful computer while saving on graphics hardware. I suspect that depending on the hardware chosen, a computer with these chips could cost from $500 to $1,000. I want one.
If you bought a late 2019/early 2020 Mac Pro you might feel like a chump right now. These machines fully configured could cost $10,000 to $30,000. These are not consumer devices but intended for professionals who do intensive tasks like video editing. Still, the machine feels like overkill both in performance and price. Apple took their extreme pricing to an even more extreme level by offering a very expensive computer monitor, where even the stand by itself cost $1,000.
It turns out that the M1 chip is very good at video editing because it has specialized circuits dedicated to video processing. When the M1 chip came out a year ago, I saw YouTubers claiming that they were going to sell their $30,000 Mac Pro because the $700 Mac Mini gave them all the performance that they need.
However, Apple has taken the M1 chip to more extreme levels. A few months ago, they introduced laptops that contain the equivalent of 2 or 4 M1 chips, starting at around $2,000. Although these machines are powerful, this is more computer power than most people need; it appears to me that you can get a really good laptop for a few hundred dollars.
I am not fond of laptops because I don't need anything portable. Laptops typically cost more than desktops and deliver less performance.
Apple didn't stop there. They just introduced a couple of Mac Studio models, which look like ugly boxes to me, with the equivalent of 4 M1 chips for $2,000 or the equivalent of 8 M1 chips for $4,000. According to Apple, the higher-priced computer is 90% more powerful than the $30,000 Mac Pro that it has been selling for the last two years. If you have a Mac Pro, you probably feel like a chump. When Apple introduced it, they had to know that they were going to come out with the M1 chip a year later.
This tells me that Apple is always ready to gouge its customers. They get away with it because some people have more money than sense.
The $4,000 Mac Studio is almost the most powerful computer that you can buy, and Apple claims that it is the most powerful computer for the price.
Apple has stated that they are going to come out with a new Mac Pro. It might be an iMac model. The rumor mill says that it will have the equivalent of 16 M1 chips on it, but using an upcoming M2 chip instead. We shall see, but who needs this much power?
My iPad 4 was a serious investment. I don't think that I got $400 value out of it. There are many apps that it will no longer run, so I feel abandoned by Apple.
My $75 Fire HD 10 inch tablet is almost as powerful and runs everything I have tried.
The situation is far worse with the $200 Microsoft Surface tablet that I purchased 10 years ago. It was very underwhelming to begin with, and now it will run next to nothing. There is an online support group of people who for some strange reason are still fans of this tablet.
I've been arguing that people don't need tablets if they have a good smartphone.
I am not personally interested in laptops, but I am impressed with the progress Apple has made with its custom processors. Apple was the first company to introduce a 5-nanometer processor. I'm waiting for AMD and Intel to catch up. Until recently, Intel was struggling to go from 14 nanometers to 10 nanometers.
The biggest chip manufacturer, TSMC, is in Taiwan. It has been reported that Intel has contracted for 100% of TSMC's not-yet-available 3-nanometer chip production. In other words, everyone else is out of luck and would have to look elsewhere to produce faster chips.
Roughly 32 years ago I had an argument with a coworker. He argued that once internet speeds became fast enough to transmit full-screen video, we wouldn't need game consoles, since we would be able to stream video games from a server to our computer screens. Rather than pay for expensive hardware, that hardware could be on a server someplace, saving us money.
I have to admit that he had remarkable foresight literally 30 years ahead of his time. This was at a time when the Internet was text only. However, I saw a number of problems with his idea...
1. Internet speeds were still fairly low, like 1,200 to 2,400 bits per second.
2. Latency is always an issue when playing games. No matter how fast your Internet is, there is an overhead to transmitting data back and forth.
3. It is always advantageous to have your own hardware. Imagine having to share hardware with other people competing for the same physical resources. I figured that hardware would get cheaper over time, eliminating the need to share hardware with other people.
4. His idea reminded me of the early days of computing where you would have to dial into a mainframe using a dumb terminal, one of which I actually owned and used at the time, whereas the new trend in computing was for everyone to have their own computer.
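The first two objections are easy to put numbers on. The frame size and round-trip figures below are my own illustrative assumptions, not from the original conversation.

```python
# Rough numbers behind objections 1 and 2 above.
# Assumptions (illustrative): an uncompressed 320x200 frame at 8 bits per
# pixel, a 2,400 bps modem of the era, and a 50 ms modern round trip.

MODEM_BPS = 2_400                   # typical dial-up speed at the time
FRAME_BITS = 320 * 200 * 8          # one uncompressed 8-bit 320x200 frame

seconds_per_frame = FRAME_BITS / MODEM_BPS
print(f"{seconds_per_frame:.0f} seconds to send a single frame")

FRAME_BUDGET_MS = 1000 / 60         # ~16.7 ms per frame at 60 fps
ROUND_TRIP_MS = 50                  # optimistic round-trip latency (assumed)
print(f"one round trip spans {ROUND_TRIP_MS / FRAME_BUDGET_MS:.1f} frames")
```

At 1990 modem speeds a single uncompressed frame would have taken minutes to send, and even today a 50 ms round trip swallows several 60 fps frame budgets before your button press shows up on screen.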
I argued that streaming video games would never be practical. He couldn't understand why I didn't see the obvious wisdom of his idea.
Two years ago Google introduced Stadia, a video game streaming service, and it totally flopped. Other companies like Microsoft and Amazon are working on the same idea, but they all suffer from the same problems, like latency.
It makes very little sense to be dependent on unreliable Internet communication and shared hardware to play games when you can purchase a video game console like the Xbox Series X for $500. Putting hardware in a centralized location instead of your living room isn't necessarily cheaper, except that you can share that hardware with other people, but what if you all want to use the hardware at the same time?
In theory, this could become practical someday, but the same technology that will make this more feasible will also make it more feasible for you to have your own hardware that is just as good. This is the problem I saw three decades ago.