A simple website

This website is a trip down memory lane. I'm not trying to tell you to stop doing modern web development; this website itself uses technologies that weren't available at the time the content here is about. It works on mobile (tested in Firefox for Android), but you miss out on the background image.

I created my first website somewhere in the early 2000s, and like most websites back then, it was very simple. Not surprising, considering most people (including me) were likely using Notepad to create those websites, which puts a limit on their complexity. It was either that or a WYSIWYG editor that chained you to itself, because there was no chance the generated HTML would be maintainable without the tool, and manual edits could outright break the editor.
There were no iPhones, there was barely any SEO, and JavaScript really was optional, and so was CSS.
The color representation on early LCD screens was bad, so you were better off picking a color scheme with high contrast.

The resolution of choice was 1024×768 (or 1280×1024 if you could afford it), and yet with the window frame and the toolbars we had back then, your website had better also work at 800×600. You didn't want content right up against the left or right screen edge anyway.

For reference, this image shows the resolutions 1920×1080 (Full HD), 1280×1024, 1024×768, and 800×600 in relation to each other.
[Image: colored rectangles stacked on top of each other, comparing the display resolutions]

Basic layout

The simplest sites were just plain HTML without anything beyond basic formatting. Sometimes a bgcolor="black" and text="white" on the body if we felt fancy.
People who cared would change the font to a sans-serif type, often using a single <font> tag wrapped around the entire page.
There weren't a lot of safe fonts out there; your choices were limited to the handful installed on practically every machine.
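A minimal page in that style might have looked like this sketch (the colors and font picks are just illustrative):

<html>
<body bgcolor="black" text="white">
<font face="Verdana, Arial">
<h1>Welcome to my homepage!</h1>
That one font tag near the top restyles the text of the entire page.
</font>
</body>
</html>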

Animations

Animated GIF images were always an option, but they eat up valuable bandwidth. For simple scrolling or blinking animations, the <marquee> and <blink> tags had you covered. Marquee is deprecated but still works; blink was removed entirely, but a bit of JS brings it back.
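If you want to try it yourself, a marquee is a one-liner, and blink can be faked in a few lines (a rough sketch; the blink class name is my own):

<marquee scrollamount="4">This text slides across the page</marquee>
<span class="blink">This text blinks</span>
<script>
// toggle visibility twice per second for everything tagged .blink
setInterval(function () {
  document.querySelectorAll('.blink').forEach(function (el) {
    el.style.visibility = el.style.visibility === 'hidden' ? '' : 'hidden';
  });
}, 500);
</script>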
In that regard, please enjoy the most important animation ever in pure HTML.

Images

In the absence of most CSS features we now take for granted, tiny images were commonly used to achieve things like color gradients, rounded border edges, and fancy-looking buttons. It wasn't uncommon for a site to be made up of 10 or more images for this purpose.

We also occasionally added background images. These were usually very small and would tile across the page. You could steal them from an existing website, or make one yourself. In fact, I proudly present the literal "wall" paper on this page, created in mspaint in two minutes. It's around 130 bytes. Not as fancy as an animated image but better than a solid color.
If the wallpaper made text unreadable, we would either go into mspaint and flood-fill it with a different color, or just increase the font size. We weren't in a rush, and we didn't need to cram as much information as possible into the viewport either.

The fancy people would make the background image stay put when scrolling the content, just like on this page.
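Back then that was an IE-only trick using a proprietary body attribute; today it's one line of CSS. A sketch of both, assuming a tile called wall.gif:

<!-- the old IE way -->
<body background="wall.gif" bgproperties="fixed">

/* the CSS way, as used on this page */
body {
  background-image: url(wall.gif);
  background-attachment: fixed;
}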

Partitioning your website

Modern websites have it easy. If you want your header or a menu to stay visible, you can just use position:sticky and voilà, your element will stay inside the viewing area when scrolling.
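A sticky page header, for instance, is just:

header {
  position: sticky;
  top: 0; /* pin to the top edge of the viewport while scrolling */
}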

We didn't have this luxury, but what we had were framesets. A frameset was (actually, still is) a way of partitioning the browser window and displaying individual pages inside those partitions. Many websites would have the following layout:

Title
Menu | Content
Footer

Unless blocked with an attribute, users could drag the frame borders around to change the partition ratio. The borders could also be made invisible, creating a seamless experience. If the background color of each frame was the same, you usually couldn't tell that frames were there at all, unless you looked closely during page load.

The frames could interact with each other. You could give them an identifier using the name attribute, and links could change the URL of a given frame using the target attribute. My first background music changer worked this way.
A frame could contain a website, or another frameset, allowing you to partition the window multiple times.
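That nesting is exactly how the layout shown above was built. A sketch, with hypothetical file names:

<frameset rows="80,*,40">
  <frame src="title.html" name="title">
  <frameset cols="150,*">
    <frame src="menu.html" name="menu">
    <frame src="content.html" name="content">
  </frameset>
  <frame src="footer.html" name="footer">
</frameset>

<!-- inside menu.html: load pages into the content frame -->
<a href="guestbook.html" target="content">Guestbook</a>

The frameset replaces the <body> of the outer document, and the * rows and columns take whatever space is left.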

Yes, this technology still works. It is marked as deprecated, but it still renders and behaves like it did over 20 years ago.

Tables, so many Tables

Tables were pretty much the only universal means of creating a responsive web layout. If you fixed the width of all but one column and set the table width to 100%, the unfixed column would automatically adjust to the current window size.

Tables were also the first universal way of vertically centering content using the valign="middle" attribute. The frameset layout shown above would occasionally be made with tables rather than frames. People would shove entire websites into single table cells.
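A sketch of that table-based version of the layout above (the widths are illustrative):

<table width="100%" height="100%" border="0" cellspacing="0">
  <tr><td colspan="2">Title</td></tr>
  <tr>
    <td width="150" valign="top">Menu</td>
    <!-- the only column without a fixed width, so it tracks the window -->
    <td valign="middle">Content</td>
  </tr>
  <tr><td colspan="2">Footer</td></tr>
</table>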

Responsive Layout

Responsive meant the site would adjust to the screen size of a desktop computer. Nobody cared about the modern doomscrolling rectangles because they didn't exist yet. We didn't have flexbox, but float:left gets you quite far.
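A two-column sketch using nothing but a float:

<div style="float:left; width:150px">Menu</div>
<div style="margin-left:160px">Content sits in its own column beside the floated menu.</div>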

The early solution to mobile devices was a completely separate website, optimized for small screens. People would be redirected based on the user agent string.

JavaScript

JavaScript is a language designed to be just barely good enough to make animated GIF images dance around the page and to make the content around your mouse cursor sparkle when you move it.
And this is exactly how we used it. This, and drop-down menus. Anything beyond that got you into the realm of browser compatibility problems. Those problems are what made jQuery popular: it detected your browser and abstracted away all the differences it had from other browsers. But for a very long time, conditional HTML comments were the norm for making IE behave. Practically every page had a <!--[if lt IE 7]>...<![endif]--> section.
For most small scripts, you would go to a site like dynamicdrive.com (defunct), search for the script you wanted, and copy-paste it into your website. This behavior hasn't significantly changed, although now you go to Stack Overflow or ask an AI. Back then, copy-pasting scripts was your best bet because JS debugging tools were virtually non-existent.
Script loading would (and still does) block the page from rendering anything below it, so script tags were traditionally put at the bottom of the page rather than in the head, and small scripts were inlined. Today we have the defer attribute.
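With defer, the script downloads in parallel and runs only after parsing finishes, so it can live in the head again (app.js being a stand-in):

<script src="app.js" defer></script>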

A script at the bottom of course meant users with slow connections could try to interact with your page before it had fully loaded. This was often solved by initially hiding the main page element and displaying a "Loading" text (or GIF if you were fancy) instead.
A small script jammed into the onload attribute of the body would then hide the loading banner and show the main content.
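A sketch of that pattern (the element names are made up):

<body onload="document.getElementById('loading').style.display = 'none';
              document.getElementById('main').style.display = 'block';">
  <div id="loading">Loading...</div>
  <div id="main" style="display:none">
    <!-- the actual page content -->
  </div>
</body>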

If you ever need to make a page interactable before it has fully loaded, use a mutation observer on the document root element that monitors added nodes, then simply attach the events to the relevant elements as they're streamed in.
We didn't have that back then. Instead, we would register a setInterval function that repeatedly attached events, and unregister it in the onload event.
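A sketch of the observer approach (the selector and handler are hypothetical):

new MutationObserver(function (mutations) {
  mutations.forEach(function (m) {
    m.addedNodes.forEach(function (node) {
      // wire up buttons the moment the parser adds them to the DOM
      if (node.nodeType === 1 && node.matches('button.action')) {
        node.addEventListener('click', onAction);
      }
    });
  });
}).observe(document.documentElement, { childList: true, subtree: true });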

Regardless of the number of scripts on a page, serious websites would work to some extent without them, simply because JS was a security problem and so was disabled in many corporate environments.
The <noscript> tag can be used to render any content (including <style> tags) when scripts are disabled. It's usually used to inform the user that JS is necessary, or to provide a link to a less interactive version of the site.
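A typical sketch, assuming the app renders into an element called #app and a basic fallback page exists:

<noscript>
  <style>#app { display: none; }</style>
  <p>This site requires JavaScript.
     Try the <a href="basic.html">basic version</a> instead.</p>
</noscript>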
However, JS is now considered a base requirement for most websites. Sites that use JS-based UI rendering will just remain blank when JS is disabled.

Dynamic Server Side Content

A super fancy page would show dynamic content from the server and keep it updated. The simplest solution that worked in all browsers was an iframe with a meta refresh tag inside, which unconditionally reloaded the iframe. By making the border invisible, people couldn't even tell.
This of course is kind of bad, because you reload the page regardless of whether there is new content to show.
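A sketch, with ticker.html being a stand-in:

<!-- on the main page -->
<iframe src="ticker.html" frameborder="0"></iframe>

<!-- inside ticker.html: reload unconditionally every 10 seconds -->
<meta http-equiv="refresh" content="10">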

Long Polling

Long polling must be among the dirtiest, nastiest tricks we used to get content to dynamically update from the server. When you load a website, your browser actually renders the HTML elements as they're streamed from the network. This is why on some slow sites the width of various elements changes as the site loads.
HTTP was strictly a client-initiated protocol. We figured out, however, that you can abuse the HTML streaming behavior to push data from the server to the client.
You would do this by loading a page in an iframe that was purposefully designed to never stop loading. It would occasionally send an invisible HTML comment to keep the connection open, but would otherwise remain silent until it was necessary to push new content to the client. You would then simply send a script tag with new JavaScript instructions inside, or if the content was purely for display purposes, a div with the content inside, plus a piece of CSS code to hide the previous div.
The hidden page would grow indefinitely this way, but a meta refresh that triggered when the connection ended solved that.
It was crude, but it was dynamic content without JS. We built entire live chat systems around this.
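The server side of such a forever-frame might have looked roughly like this in PHP (fetch_next_message and parent.addLine are stand-ins for your own message source and client-side handler):

<?php
set_time_limit(0); // this request is supposed to run forever
while (!connection_aborted()) {
    if ($msg = fetch_next_message()) {
        // the browser executes this script tag the moment it streams in
        echo '<script>parent.addLine(' . json_encode($msg) . ');</script>';
    } else {
        echo '<!-- keepalive -->'; // invisible, but keeps the connection open
    }
    @ob_flush();
    flush(); // push the bytes out instead of buffering them
    sleep(1);
}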

The first real replacement for long polling was WebSockets. HTTP/2 and HTTP/3 have the ability to push events to the client without waiting for a client request, in what is known as "server push", but I've never seen it in the wild.

Ajax

Ajax stands for "Asynchronous JavaScript and XML". It was invented by Microsoft to streamline communication between web browsers (well, only Internet Explorer) and the Exchange Server web interface. The technology is actually quite old: Internet Explorer 5 was the first browser to ship with it, but others were quick to adopt it too. Microsoft products use XML intensively, which is why XML is part of the name "Ajax", why the JS object for making requests is named XMLHttpRequest, and why it has a dedicated responseXML property.
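The classic usage, including the ActiveX fallback for old IE (the URL and target element are made up):

var xhr = window.XMLHttpRequest
    ? new XMLHttpRequest()                    // everyone else
    : new ActiveXObject('Microsoft.XMLHTTP'); // Internet Explorer 5/6
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById('news').innerHTML = xhr.responseText;
    }
};
xhr.open('GET', 'news.html', true); // true = asynchronous
xhr.send();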

ActiveX

Anything not covered by web standards could be added via ActiveX, which was basically a way to load and call functions from native DLLs that had registered themselves as ActiveX components. This was necessary to play video. It was also needed for audio if you wanted any control over the playback whatsoever, because <bgsound> would not let you control playback beyond replacing its value with "about:blank" via JS to stop it entirely. Now deprecated, <bgsound> has been replaced with <audio>. And thankfully, fully automatic audio playback on page load is no longer permitted by modern browsers, because that is certainly something I don't want back.

Back then, ActiveX was also a way to bypass some system restrictions. At school they would block the remote desktop client, but that block only covered the executable. I would just load the MSTSCAX library in an HTML file and use the web browser to connect to my home computer whenever I ran into a blocked website.

The entire system was a nightmare, but Microsoft could basically do whatever they wanted because of Internet Explorer's massive market share. Every browser handled plugins differently, and you had to update these components all the time.

Flash

Flash (see: "Vulnerability as a Service") allowed you to do many things that initially weren't possible without, including but not limited to video and audio playback without depending on locally installed codecs, live streaming, file uploads with a progress bar, and networked multiplayer games.
At some point it was basically mandatory to have it installed, because most websites depended on it to some extent.

Bandwidth

Bandwidth was obviously at a premium. And while times were slower back then, we didn't want to wait for ages either. DSL was just getting popular, and I was lucky enough to start out with a 5000/500 kbps connection.
A good website, however, would load acceptably on a dialup connection. These were commonly known as "56k" because that was their speed under ideal conditions: 56 kilobits per second, which amounts to 7 kilobytes per second.
You would often get a bit more because the connection supported compression, and HTML compresses well.

Image optimization

Images were considered high-bandwidth media back then, and you would not fill your page with them without some serious optimization.
The image further below is a new version of the Windows XP default wallpaper known as "Bliss" that Microsoft recently released. The colors in their version are more muted than the original, but I found this "corrected" version that's more in the spirit of the original. The image is 2'345'199 bytes in size and would take about 5.5 minutes to load on a 56k connection.

Resolution

To improve load speed, we first drop the resolution. If our site is meant to work on 800×600 displays, there is no need to serve this image at its original 4089×2726 size, and since the frameset menu eats part of the screen anyway, we can scale it down to fit 640×480 (the OG VGA resolution).

Quality

Next is the quality. By saving the JPEG at 75% quality we can further reduce the size. This quality value is good for noisy pictures like this one, but will show artifacts around hard edges, of which there are none here.
With those limits the image will load in about 6 seconds.

Progressive scan

We can simulate a faster image load by saving it as a progressive image. Progressive images don't store the pixels in the normal top-to-bottom order, but rather in groups.
(This is a simplification.) The first group stores only every 8th pixel horizontally and vertically. The next group stores every 4th pixel (minus those already contained in the first group), and so on until all pixels are represented. Depending on how good your eyes are, you may not even notice the last pass.
Images stored this way are slightly larger, but rather than waiting for the image to slowly appear line by line, we fairly quickly get an initial (blurry) version that progressively (hence the name) gets better.
This is also possible with PNG images.

The image below has been saved with all the given constraints and settings mentioned above. It is now about 35 KB. To see it load progressively, refresh the page or open the image in a new tab.

[Image: the scaled-down Windows XP default wallpaper]

If you didn't want to compromise on quality, you would create a thumbnail that, when clicked, navigated the user to the raw image.

Here's the PHP code that converted this image:

$bliss = ImageCreateFromJpeg('bliss.jpg');
// "-1" means to calculate height to keep aspect ratio
$scaled = ImageScale($bliss, 640, -1, IMG_BICUBIC);
ImageDestroy($bliss);
// Enable progressive scan
ImageInterlace($scaled, TRUE);
// Save with 75% quality
ImageJpeg($scaled, 'bliss_scaled.jpg', 75);
ImageDestroy($scaled);

Dynamic Loading

If the client had JavaScript enabled, you could load the image using JS only when the user scrolls the image into view. This could save substantial bandwidth on image heavy sites.
You can still do this, but the browser does it automatically if you add the loading="lazy" attribute. This also stops the image from blocking the page load event.
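One attribute is all it takes now:

<img src="bliss_scaled.jpg" loading="lazy" alt="Windows XP default wallpaper">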

Interaction

Images could be made interactive to some extent. If an image was inside an <a href> tag, the ismap attribute could be added to it. When clicked, the browser would navigate to the URL in the link, but append the click's X and Y coordinates to the URL as a ?X,Y query string. This let the server check where in the image the user clicked, which allowed for image-based menus, or a "Where's Waldo" (or "Wally" in some countries) style game.
Later came client-side image maps, which are invisible clickable regions you could overlay over an image.
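A sketch of both variants, with hypothetical files and coordinates:

<!-- server-side: clicking appends ?x,y to the link target -->
<a href="map.php"><img src="menu.gif" ismap></a>

<!-- client-side: invisible clickable regions over the image -->
<img src="menu.gif" usemap="#mainmenu">
<map name="mainmenu">
  <area shape="rect" coords="0,0,100,40" href="home.html" alt="Home">
  <area shape="rect" coords="0,40,100,80" href="links.html" alt="Links">
</map>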

Both of these technologies still work, and neither of them is marked as deprecated.

Deployment

Yolo-driven development was the norm back then for personal sites and small company pages. You changed a few lines, then uploaded your changes to your web server with an FTP client, likely not even encrypted. Your choices were PHP on an Apache web server or ASP on IIS.
Staging environment? What staging environment?

It was crude, but oh boy was it fast to get something going. PHP is easy to get started with, more forgiving than other languages, available practically everywhere, and still actively maintained. At this point it has probably gained cockroach status and will be around for ages.

This is also how this website is deployed, except instead of FTP I use Syncthing, because I want changes to upload automatically, but straight to prod it goes. It doesn't need any form of compilation or build process whatsoever.

Final words

Simpler times. Not necessarily better, but simpler. We achieved a lot with less. We optimized our media, and used very little scripting. Now we don't. Nobody cares anymore if your website is 10 or 20 megabytes.
Do I miss creating websites in a plain text editor? No. I want to keep my IDE with syntax highlighting, syntax checking, and code completion. I also want to keep the libraries that make my life easier. I haven't written an SQL statement in my backend code for a long time now. SQL ORM mappers are great.
Do I wish we would create simpler pages again? Yes. Although websites back then were simpler and the technology more limited, it was not necessarily worse than today; the average website now feels bloated and overengineered. The internet is no longer a place of creation; it is a place of consumption. I want the world wide web wild west back (call it W5 or whatever). I want search engines to find wacky websites again. I don't want to please the algorithms of our corporate overlords.
While writing this page, I found out that in 2021, Warner Bros took down the original Space Jam website from 1996. I would have liked to link it here, but I guess the web archive is your only hope now. If you have time, check out how the main menu on that site was made.

In any case, my tribute to those times is a wordle clone. If you have Internet Explorer 1.5 or later, or any modern browser (I heard this Mozilla thing is taking off), you can play a game of wordle here. I recommend using at least IE 2.0, because 1.5 lacks color support, which makes the clues less visible.

If you want to go super oldschool and can get a copy of NCSA Mosaic running (the browser that popularized the web), you can play the game here instead. I can confirm the browser runs on Windows XP, and possibly later versions of Windows, but likely only on 32-bit versions.

Note: The WWW was designed to share crudely formatted scientific documents and link them together, which explains why it's so hostile towards screen-oriented applications but has a plethora of features that work well for print media.
This early feature set is exactly what Mosaic 1.0 provides. It lacks color support and doesn't even have a means for text input, but I managed to cobble something together for it anyway. Its HTTP support is best described as "somewhat there", which explains why this wordle version runs on its own port.

That 56k Feeling

You're experiencing it right now. This website is looped through an RS-232 serial connection at a 56k baud rate (actually a little extra, to cover protocol overhead). I disabled the server cache so you can experience the scrollbar shrinking as content slowly loads in.
But some things are worth the time. People may not even notice that your website is slow if you give them a bit of content to read before shoving an AI generated title image down their throat.
You don't need a serial loopback for this; a tool like SlowPipe in front of your server does the same.

Until then

NO CARRIER