Hacker News
Really Atari ST? (os2museum.com)
110 points by diffuse_l on Sept 22, 2020 | 91 comments



I vaguely recall that people used to take their Atari floppies and format them on an MS-DOS machine, as it would yield a slightly bigger capacity - something to do with the default format on DOS using more tracks. That was until alternative floppy-formatting tools came about on the Atari scene, and there was one that was also great for copying discs - it's going to bug me to recall its name, but it was one of those utils that had a cult rep in its day in Atari land.

[EDIT ADD] Ok, had a dig around and the tool most used was Fastcopy Pro - https://sites.google.com/site/stessential/disks-tools. It was useful for doing fancy formats for extra capacity if you had good-quality discs, as well as for copying/backing up discs.


Both DOS and the Atari used the FAT format. The capacity was identical, but MS-DOS would have trouble reading Atari ST floppies: Atari had followed the published standard and wrote two copies of the file allocation table to the disk, while MS-DOS would only write one, overwrite the second one with data, and corrupt the disk. The 'extra space' was the space that was supposed to be taken by the redundant file allocation table.

The rule was if you wanted to move files between systems, you had to format the floppy on MS-DOS, then you can use it everywhere. If you formatted on the Atari and used it on MS-DOS, you would end up using it nowhere.


I didn’t grow up with floppies, so pardon my ignorance, but two questions: (1) which FAT? I figure it can’t be FAT32, but that still leaves FAT12 and FAT16, both of which Microsoft helped develop. And (2) since Microsoft helped develop it, were they just not following their own spec? Because that doesn’t make sense (not that you’re wrong).


FAT, AFAIK, is also known as FAT12 - there's a nice explanation in the wiki of the differences and some history aspects: https://en.wikipedia.org/wiki/Design_of_the_FAT_file_system
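To make the "12" in FAT12 concrete, here's a small Python sketch (not from the thread, just an illustration of the on-disk layout): two 12-bit cluster entries share every three bytes of the table, stored little-endian.

```python
# Decode 12-bit FAT entries: every 3 bytes hold two entries,
# stored little-endian as on MS-DOS floppies.
def fat12_entry(fat: bytes, cluster: int) -> int:
    off = cluster * 3 // 2              # byte offset of the packed pair
    pair = fat[off] | (fat[off + 1] << 8)
    if cluster % 2 == 0:
        return pair & 0xFFF             # even entry: low 12 bits
    return pair >> 4                    # odd entry: high 12 bits

# Three bytes 0x23 0x61 0x45 encode entries 0x123 and 0x456.
fat = bytes([0x23, 0x61, 0x45])
print(hex(fat12_entry(fat, 0)), hex(fat12_entry(fat, 1)))  # 0x123 0x456
```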


Of course MS didn't care about their 'standard'. They assumed they were the only ones using it and/or didn't really care about breaking other systems.


Sometimes they broke other systems on purpose, e.g. https://en.wikipedia.org/wiki/AARD_code.


The other direction, too: "DOS isn't done until Lotus won't run!"


MS-DOS was also not the only DOS on the market. DR-DOS was very popular at the time of the Atari ST. As far as I know, DR-DOS also followed the published specification, and Digital Research (DR) was Microsoft's biggest competition. Atari also used Digital Research's GEM as their "desktop" GUI. GEM was a direct competitor to Microsoft's new Macintosh-killer product called "Windows". CorelDraw and WordPerfect used GEM; MS Word and MS Excel used Windows.

Draw your conclusions.


That screenshot of Fastcopy just brought memories flooding back! I was one of the strange kids in the 90s who had an Atari while everything else in the world was Windows. The FAT format was invaluable for me, as I could easily transfer files between home, school and other kids' computers... I didn't appreciate at the time that this kind of compatibility wasn't a given.


I remember seeing a tool that could be used to punch a hole in floppy disks in order to double the capacity. I didn't have such a tool, but I did have a cheap soldering iron. I ended up with a bunch of 3.5" floppy disks with holes melted through them.

Thinking back on it now, I'm surprised that I didn't damage the disk with the heat of the iron. Then again, maybe I did and didn't notice because I was 12.


I've seen it used - IIRC the problem is that the media isn't rated for the storage that you're now asking of it, so while it may survive being written/read a few times, it'd fail sooner.


It might fail sooner, but you were banking on it essentially just being a binning distinction, that they didn't really do any testing either way and just stuck them in cases with or without the hole depending on what they thought they could sell at the time.

Did I lose files on fake doublesided disks? Yes. Did I lose files on real doublesided disks? Also yes.


Can confirm. I used to drill holes with a... well, a drill.

Totally worked. Capacity increased. Until one day your files were corrupted. And that day often came soon!


You're basically overclocking your storage medium.


Flip-it?


https://www.computerhistory.org/collections/catalog/10271130...

Might have been that brand. I remember seeing what it did and then figuring out how to get a hole in the 3.5" disk without cracking the plastic.


I remember a tool called 2M on DOS, which allowed formatting 1.44M disks to non-standard capacities, up to ~1800k or so...?

Edit: https://en.wikipedia.org/wiki/2M_(DOS)


As someone who didn’t grow up with many floppy disks, how was one able to format a floppy to have a different capacity? I know there’s different “formats” such as DOS, C64, etc, but I don’t understand why?

I have heard about how formatting a floppy involved placing the tracks and how modern hard drives have “hard sectors”, but for some reason, it’s not “computing.”


Most discs were rated to format to 80 tracks but some you could format with 81 or 82 and IIRC even 84 on some.

Then there were sectors, of which it was common to have 8, though again with better-quality discs (quality did get better ahead of the standards) you could go with 9 sectors and higher - https://www-user.tu-chemnitz.de/~heha/basteln/PC/usbfloppy/f....

Of course, you could think of it as over-clocking - there was no guarantee you'd get that extra capacity, and in the early days it was really luck, but like most things, quality improved and such avenues of formatting became more accessible.


The floppy drive has a stepper motor that controls the placement of the read/write head above the disk.

In the ST era, the control of that motor was directly under the control of the operating system. For an 80 track disk, the movement required to step between tracks was a certain known amount.

If you formatted the disk with the tracks spaced closer together, by altering the stepper movement during that process, you would 'magically' get more space.


>In the ST era, the control of that motor was directly under the control of the operating system.

No, it was under control of a stepper motor driver circuit located directly on a Floppy drive pcb. This driver in turn received instructions from a dedicated Floppy drive controller (WD1771 and compatibles).

> For an 80 track disk, the movement required to step between tracks was a certain known amount.

"certain known amount" being one step per track

>If you formatted the disk with the tracks spaced closer together, by altering the stepper movement during that process, you would 'magically' get more space.

Above is incorrect.

It's impossible to move the head stepper motor between tracks. The floppy drive has a STEP (/STEP) and a DIRECTION (/DIR) pin. All you are able to do is pick a direction and step one track at a time. The step distance is fixed. The whole point of using a stepper motor is that you don't have to worry about head tracking/alignment.

The only personal computer Floppy drives with flexible head positioning all used voice coil head actuator - Floptical, LS-120/240, Zip drives etc.

https://twtext.com/article/1263702402886057984
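The STEP/DIR behaviour described above can be modelled in a few lines of Python (a toy model for illustration, not drive firmware): the only operations are "pick a direction" and "pulse once", and each pulse moves exactly one track.

```python
# A floppy head moves only in whole-track steps: pulse /STEP with /DIR set.
class HeadModel:
    def __init__(self, tracks: int = 80):
        self.track = 0
        self.tracks = tracks

    def step(self, outward: bool) -> None:
        # one pulse == exactly one track; there is no fractional positioning
        self.track += -1 if outward else 1
        self.track = max(0, min(self.tracks - 1, self.track))

h = HeadModel()
for _ in range(5):
    h.step(outward=False)
print(h.track)  # 5
```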


The disk controller chip let you seek track by track. No fine control at all.

You could format extra tracks, if you felt lucky. Usually you could get away with a couple extra. I never did this with any files I valued.


Weren’t there special compression softwares that added even more capacity?


There was; I can't recall the one that was popular prior to Microsoft adding DriveSpace in MS-DOS 6.0: https://en.wikipedia.org/wiki/DriveSpace


That'd probably be DoubleDisk or Stacker. https://forum.winworldpc.com/discussion/10401/software-spotl...


Stacker was pretty common in the early 90s.


I don't really remember any: my main recollection as an ST user was the format programs that let you increase the track count and the sector density.

I recall the default format was 9 sectors, 80 tracks, 2 sides with 512 bytes per sector = 720KB, and you could push up to sometimes 82/83 tracks and maybe 10 or 11 sectors to get more out of a diskette.

Actually, I think my ST originally came with a single-sided floppy drive and had to be upgraded.


The initial release of Ataris had single-sided drives; I think the 1040ST had double-sided though. They did upgrade that shortly afterwards, I believe, to enable the marketing bods to show a larger number they could compare to the Amiga. So it was only early 520STFM systems, and those had a red drive light while the double-sided ones had a green one, if my memory holds.

The sad thing was, due to the early single-sided models, many games would limit themselves to 720k so as to not limit their market. That saw games that would happily fit upon a single double-sided disc cast upon two floppies, forcing switching. So that small batch of single-sided initial-release systems really did have a legacy impact that lasted for years and did the platform no favours.


Basically your floppy disk is formatted on each side with a number of tracks, each containing a number of sectors of a given size. [1]

The standard 720KB 3.5" floppy disk as used by the Atari ST used 80 tracks of 9 sectors.

With these tools you could format with, say 11 sectors instead of 9 and boost capacity to 880KB (I think the Amiga was doing that as standard).

[1] https://en.wikipedia.org/wiki/List_of_floppy_disk_formats
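The arithmetic behind those figures is just sides x tracks x sectors x bytes per sector; a quick Python check of the 720KB and 880KB numbers above:

```python
# Floppy capacity = sides * tracks * sectors_per_track * bytes_per_sector.
def capacity_kib(tracks: int, sectors: int, sides: int = 2, bps: int = 512) -> float:
    return sides * tracks * sectors * bps / 1024

print(capacity_kib(80, 9))    # 720.0 -> the standard ST double-density format
print(capacity_kib(80, 11))   # 880.0 -> 11 sectors per track, as on the Amiga
```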


On top of that some tools (I used fdformat) shifted the sectors to speed up sequential reads, i.e. have the first sector of the next track rotate in under the r/w head right as the head arrives.

Edit: another unrelated practice was buying up cheap(er) single-density disks, which were distinguished by the lack of a marker hole opposite the r/o protection slider, and used only one side of the medium for data. By drilling it out, one could trick the drives into using both sides, which usually worked out just fine. Mentioned here [1]

[1] https://en.wikipedia.org/wiki/Double-sided_disk
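The sector-shifting trick can be sketched like this (the skew value and layout function are illustrative, not fdformat's actual parameters): each track's sector numbering is rotated a little, so after the head steps to the next track, logical sector 1 is just about to rotate under it.

```python
# Rotate the logical sector IDs written around each track by `skew`
# positions per track, so sequential reads don't wait a full revolution
# after each track-to-track seek.
def track_layout(track: int, sectors: int = 9, skew: int = 2) -> list[int]:
    shift = (track * skew) % sectors
    order = list(range(1, sectors + 1))
    return order[sectors - shift:] + order[:sectors - shift]

print(track_layout(0))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(track_layout(1))  # [8, 9, 1, 2, 3, 4, 5, 6, 7] - sector 1 arrives later
```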


Ah yes, interleaving of the sectors. More prominent with early HDs, using tools like SpinRite to optimise the interleaving. Today it's all 1:1, as processing from the heads has been fast enough for decades to handle the speeds.

SpinRite still exists, though in the early days/versions it was the golden tool for tuning up a system and could make huge differences - like doubling your drive speed and more in some instances. But we're talking late 80's/early 90's here, when interleaving was a thing; by the mid 90's 1:1 became the norm and made it moot. https://www.grc.com/Spinrite.htm


Single-density is not the same as single-sided. There are single-sided double-density disks.


I seem to recall having a program called "Twister" that could format disks like this.


There are so many things the ST does badly - weak sound chip, screen memory laid out like a Venetian blind (to quote Jeff Minter I think), and a mouse that was designed in a universe where ergonomics did not exist.

But I still love it. Without the ST I wouldn't have discovered programming and all the highs (and lows) that it brings. Looking back now, I'm surprised people were able to get as much out of it as they did.


Well, I'd argue about the music. The chiptune music from the SID and YM has aged very well, while the Amiga sounds horrible: no FM synth, just 8-bit 11kHz samples.

Also, memory was much better organized than on the Amiga - without the speed penalty.

And the simplicity of the Shifter (the "GPU") allowed for really awesome 'beyond the dream' hacks, which were unavailable on the Amiga due to its much more capable - but less 'hackable' - video chip.

And then the first upgrade - the Atari STE - amazes today with full-control 8-channel 50kHz MODs... while the 68k CPU stayed at 8MHz!


At it again, thorianus! Spreading anti-amiga propaganda, lies, all pure, devilish, unvarnished lies!

I will follow you down the ages, through age extension tech, uploading of consciousness, to the eventual universe spanning one mind!

Always I will appear, always I will prevent your foolish, unjust and untrue claims from spreading unchecked.

Amiga is better ; she is the best. Your Atari smells of milk!! Amigas rule!

(what part of early computing culture did not have inane fan wars?)


The Atari STE had a 2-channel 8-bit DAC at 25kHz on DMA.

It was the Falcon that had incredible sound capability, with its matrix channel mixer and DSP.


Well, that's the hardware. As with video, the 16-bits went further than the mere hardware limits.

When we talk software mixing, the STE maxed out the CPU at 8 channels at 50kHz; here's an example: http://yerzmyey.i-demo.pl/YERZMYEY-Octopush_ATARI_STe.mp3

Even a plain 520 ST can do a lot with its YM chip.

The Amiga 500, on the other hand, could do a bit of this, but only with 12kHz samples, no volume control per channel, and those pesky filters.

The Amiga 1200/4000 could do the same - thanks to a much faster CPU - but... they still left the old 8-bit audio chip in :/
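Software mixing like this boils down to summing the channels' samples each output frame and keeping the result in the 8-bit signed range; a minimal Python sketch (illustrative only - real ST players used lookup tables and hand-tuned 68k code):

```python
# Mix N equal-length 8-bit sample channels into one stream, with clipping.
def mix(channels: list[list[int]]) -> list[int]:
    out = []
    for frame in zip(*channels):
        s = sum(frame) // len(frame)        # average to stay near 8-bit range
        out.append(max(-128, min(127, s)))  # clip to signed 8-bit
    return out

a = [100, -100, 50]
b = [50, 50, 50]
print(mix([a, b]))  # [75, -25, 50]
```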


There's a package EPSS for the STE which allows using its 8 channel DMA as a software synth, at the same time as running other midi channels in Cubase. Makes for a very tidy DAW:

https://youtu.be/OlspnqVcJho

Other packages (DBE tracker?) allow the STE to play 32-channel MODs, albeit not at 50kHz.


All Amigas had a 6-bit volume control for each individual sound channel.

All Amigas except the A1000 could turn the high-pass filter off, and pretty much everything tended to do this.


I'd love to own a Falcon, but the prices on eBay make my eyes water.


Star Trek: The Rebel Universe looked far better on an ST than on a PC.

[edit]

Compare:

http://www.atarimania.com/st/screens/star_trek_the_rebel_uni...

https://www.myabandonware.com/media/screenshots/s/star-trek-...


Everything that supported graphics looked better than CGA. Even monochrome Hercules graphics.

Although to be fair CGA wasn't intended to be used on monitors. It was intended to be used on smeary composite video sources where you could expand the palette using artifact colors. Even then the graphics were terrible, but you could at least get a green.


But by '87 EGA cards existed; it's just they never bothered to port it to EGA.


I've never heard of this game before. Looking it up, it's impressive considering the capabilities of then-current hardware.


Yea, I still have the floppy disk and manual for mine. 256K of RAM and no hard drive required.

It definitely required you to have lost many times before you could complete it though; the galaxy was too large for you to explore fully in the time granted by the mechanics.


Well, plain ol' Prince of Persia looked much better on ST - if you compare PC/Amiga ports. Look at the torch animation.


I had a similar experience when going through the (MRI) Ruby source code which contained conditional directives for Atari ST compilation. Support was dropped only as recently as Ruby 2.4! https://github.com/ruby/ruby/blob/c5eb24349a4535948514fe765c...


I love how every comment here is about the Atari ST instead of what the post is about, which is where the 069H byte check actually came from.


Just opens a portal for all of us geeks to uncork our love or hate for this classic machine :-)

The actual substance of the article is neat, but I don't have much to say about it beyond that...


I do too. We have shared history during really interesting and, for most of us, fun times.

Maybe it will always work this way.

Ok fine.


Blame the title.


Fair point — commenting on an article based on its title instead of the contents is a long HN tradition.


The one thing I like the most about the ST (when compared to, say, the Amiga) is its simplicity. It's vastly less capable, of course, but, in the end, the simplicity pays back by allowing easier expansion. The Amiga was a hard machine to evolve, something that cost Commodore a lot.


I got an Amiga and honestly while it was a great machine, a lot of games were written to support the ST as well so didn't use the Amiga's more advanced capabilities. If you were doing video production and graphics work it paid off but frankly I probably would have been just as happy with an ST.

Also the ST gets dinged for being less powerful, but it was actually significantly faster than the Mac at the time and with an add-on could even run Mac applications.


ST and Amiga looked like competition to each other because of the wider market they were situated in but they really had different focus and points of success. The ST was a better DTP and general productivity machine and really was a "rock bottom price" competitor to the Macintosh; its gaming capabilities were nothing compared to the Amiga, but I am not sure that's where the Tramiels really wanted to focus anyways.

I never even had a colour monitor for my ST. The paperwhite monochrome screen was to die for back then, and sure beat the interlace hi-rez experience on the Amiga.

And the ST was a few hundred bucks cheaper. Price per MHz, it was a great machine.


I had a Mac ROM cartridge for the ST, which basically turned it into a Macintosh. And Atari had a 640x400 monochrome monitor that looked just like a Mac - an awesome hack.


Yep, long before there were official Mac clones (remember those?!), the ST was an unofficial one in the form of the Magic Sac / Spectre GCR. Never had it myself, but what a glorious hack indeed.


I'd love to hear more about your thoughts on this because I'm not sure if I understand what you mean.

My take is that both the Amiga and the Atari had a plethora of expansions and, without having any numbers to show, I think the Amiga won out in the expansion race, from 040 cards for the A500 (AFAIK no 040 was available for any Atari until much later) to the Video Toaster.

Both machines were hard to evolve because both their designs encouraged software that was tightly tied to the hardware. VIDEL and AGA were both desperate and, ultimately, fruitless attempts at having the cake and eating it: sticking to custom chips was needed to keep backwards compatibility, but they made the machines both too expensive and too underpowered to be competitive.


The Amiga was much more closely tied to its hardware than the ST, by dint of its bus system being shared between the blitter and CPU for chip RAM.

Additionally, having the copper, sprites, scrolling, HAM and planar graphics modes meant that backwards compatibility was harder - just look at the difficulty of emulation for both of them.

The Amiga graphics layout is also (slightly) more difficult to work with; there's a trick for the ST to do quick chunky-to-planar (C2P) conversion - check out the texture mapping in the Thunderdome demo (http://www.pouet.net/prod.php?which=64503).

All said, yes, the Amiga had more expansion cards for it, but the central bus system was a bottleneck: for the majority of Amigas, you had to use the built-in graphics. If you look at the A1200, the AGA chipset was a poor upgrade over the original chipset, hampered by backwards compatibility and the expense of needing to add a separate (fast RAM) bank to bypass the system bus for speed.

As far as retargetable graphics go (i.e. VGA-style cards), the ST was ahead, as GEM allowed this from the start. That, combined with the much simpler design, allows a newer machine to be much more powerful, as it has to worry less about backwards compatibility.

The Atari Falcon bears this out: a stock Falcon can manage to run Quake 2 at 10FPS odd; a stock A1200, even with fast RAM, can't get close. The Falcon was actually developed using an ST with a processor socket, bearing out that the simpler architecture could be abused more :)

TBH the Falcon could have been much more, but Atari were broke and cheaped out on the 16-bit bus. It could've had 24-bit VIDEL at 800x600 and run Quake 2 at 15-20FPS for £500 in 1992. Add a CD-ROM with MultiTOS (effectively Unix with a GEM frontend) and you'd have had a competitive machine even against the PC of the time.


Well, that was the problem. Atari had the MIDI (pro music) and DTP from the start. The mono 640x400 monitor - ultra sharp was a great gig.

The Amiga went the road of being a console turned computer, and the expansions only created havoc with support. Even the A500 Plus had issues.

Sadly, it also affects the community - the IP rights for the Amiga are a mess, witness the recent issue with the Terrible Fire expansions. For some reason there's a lot of bad blood around a very bold and interesting system made by ex-Atari engineers.


The upgrade path for the Amiga chipset ended up being complicated. The 500 and 600 had the ECS chips, that allowed some video modes that were not tied to NTSC, at the cost of a reduced palette. The chipset also competes with the CPU for memory access and this second generation chipset addressed more memory, making the computer effectively slower ("chip RAM" was slow, "fast RAM" was the RAM outside the reach of the chipset that the CPU has exclusive access to). After ECS came AGA, which pushed the boundary further again (but, at this point, memory constraints were not so terrible).

And, of course, there was what seems like a cocaine-fueled endless sequence of management blunders that drove the company into the ground.


> this second generation chipset addressed more memory, making the computer effectively slower

ECS could address more chipmem but wasn't slower than OCS. You couldn't add fastmem in the trapdoor port, but that didn't matter much since those expansions weren't good enough to impact the speed (trapdoor fastmem was usually called slowfast). The problem was rather one of incompatibility: some programs written in the 512+512 kbyte era simply assumed they could allocate fastmem, which typically wasn't available on the 500+ and 600.

> After ECS came AGA, which pushed the boundary further again (but, at this point, memory constraints were not so terrible).

AGA, like ECS, could address 2 megs of chipmem. However, it had higher bandwidth and was much faster than ECS.

My point is that even though the ST/e was, as you say, a simpler design in many aspects, Atari still had to equip the Falcon with a YM chip and put support for planar 15 kHz video in VIDEL to maintain backwards compatibility. They faced the same problem as Commodore: their machines were mainly home computers used for games and other software that banged the metal and people expected this to work when upgrading. They also shared a lot of the same problems when upgrading the architecture even slightly, such as with the A3000 and TT030: programs that didn't work with newer versions of TOS/DOS and programs that didn't work with 020/030.

Both platforms are expandable with things like RTG graphics cards, sound cards, CPU cards etc. (in fact I'd argue the Amiga architecture with Zorro, video slots and CPU daughterboards was designed to be vastly more expandable than the Atari) but for most users that didn't matter: if the games they wanted to play didn't work, what point was a 24-bit display that cost more than the computer itself?

Besides, even without keeping backwards compatibility, rolling your own silicon was no longer a viable option financially. Tramiel's vertical integration was a good idea in the 1980s, but the hardware market had shifted. Commodore could've made a triple-A machine but it still wouldn't have been competitive, neither in price nor in performance. The niche markets utilizing the unique features of the Atari (MIDI) and Amiga (DTV) weren't large enough, and a new architecture that would deprecate all or most existing software (and many peripherals) used by hobbyists would probably only serve to push the home user base towards the PC anyway.


The question is also when, and at what price. An Amiga 2000 had indeed quite the expansion possibilities, but they were expensive. An Amiga 500 was a whole other story: the expandability was limited, and only later was it possible to add a lot of RAM, disks, etc. And RAM it needed, a lot, much more than the Atari - the Ataris didn't need as much memory, being less greedy with it.


Indeed. As a student (1987) I had an Amiga 500, and after a few months I sold it and bought an Atari Mega ST2 in its place. The Amiga was so unpleasant to use for programming. To be comfortably usable it required at least 2 floppy drives (a 3rd one would even have been useful) or a hard drive, expanding beyond 1 MB was also not cheap, and the display was horrible. TV resolution 576i (PAL) is inadequate for editing text.

On the Atari we had the cheap SM-124, a fantastic monochrome 71Hz-refresh-rate screen which allowed you to stay programming for hours. All the compilers and editors I needed would fit on one 800K floppy, and the 2 MB of RAM would even allow working from a RAM disk. A breeze. Gaming was still possible with the second cable connected to the TV set. (Later I installed the PC-Speed emulator in my Mega ST, which transformed it into a very capable XT PC: 8MHz V30, 704 kB DOS memory, with 640x400 Olivetti graphics.) The Amiga was better for gaming, no contest; for serious stuff like programming, word processing (pixel-perfect Signum!) and so on, the ST was so much better on a lower budget.


The Amiga's problem was the expensive monitor, the flickering screen and... with such a great video chip, the default color palette was just an abomination. I know it was made for TV, but that was the problem!


I think the Amiga was much more expandable than Atari. Just check out the variety of accelerator boards, RTG graphics cards, serial boards, HD controllers, etc. available. The Atari OS (TOS) was also very simple compared to Amiga OS.


I think that's oversimplified. TOS was single-tasking, but: a) unlike the Amiga, a proper 68000 syscall TRAP mechanism was used, meaning the OS ran with proper supervisor/user separation; b) the application toolkit (AES) built on top supported message passing and multiple-application semantics; and c) the graphics subsystem (VDI) was also, in theory, abstracted away from the physical hardware. In fact the OS components themselves had originally been developed (by Digital Research) for both x86 and 68000, and the original work was done on the Lisa before it ever got put on Atari hardware. And DR's GEM had a life of its own on PC hardware (see Ventura Publisher, etc.), though handicapped by the Apple lawsuit and competition from Microsoft.

These things meant that later the Atari community and Atari themselves were able to extend the OS in a proper multitasking almost Unix-like direction (MiNT and MultiGEM) and bring it to new hardware, and new display formats and architectures, etc. Provided the applications being run were cleanly written (well, that's a big caveat...). For example -- we can now run TOS/GEM on an Amiga, that's pretty neat (though totally pointless).

In true Tramiel fashion they shipped the cheapest, simplest thing they could. But it was something they were able to iterate on - unfortunately they just did this too slowly. As others have pointed out, the Amiga had amazing hardware from the get-go, but its architecture became somewhat tied to that original hardware. It was more like a video games console than a workstation. They had a few years' headstart on everyone else, and then having (by then dated) specialized hardware became a liability, not an advantage.


The Atari ST could do the same kind of cooperative multitasking as GEM on a PC.

I built MicroGNUEmacs as a desk accessory so that I could run it at the same time as other programs.


I always found it odd that the GEM environment supported this kind of cooperative multitasking by default only for desk accessories. The API support for doing it more broadly was there, but the desktop and all the applications were written such that they couldn't be run that way. So even when we got proper multitasking support under MiNT, there were few/no applications that behaved well in this scenario.

I never saw uemacs compiled as a DA. That would have been a nice trick. The only microemacs I used on my ST was not a windowed application, was console only.


It wasn't a trick to compile mgemacs as a desk accessory, but it took a bit of work. I still have the SH204 HD from back then, so I have the source code - just don't have an ST to read it.

I chose the MicroGNU variant as it did parenthesis matching better than the alternatives. I ran it alongside Franz Lisp that I had also made into a GEM application.


I think AmigaOS felt more sophisticated to an end-user: preemptive multitasking, shared libraries (Atari didn't get these until very late, right?), loadable filesystems, autoconfig / plug-n-play type device drivers, command line / shell. I'm sure TOS could get many of these things with add ons.


Is this a good time to post this video of the Union Demo Copy Program tool[1] for the Atari ST? When it came to serious copy jobs I used Fast Copy[2], though. But I'd heard rumours about formatting disks on DOS.

[1]: https://www.youtube.com/watch?v=cf19uSe2UIA

[2]: https://www.youtube.com/watch?v=Libl3S9AaT8


Gotta capture the floppy drive sound too!

https://www.youtube.com/watch?v=w7aogLweOac


Oh. My. God... just travelled in time back to my 14-year-old self. Thanks.


So the theory is that MS-DOS tried to detect whether a boot sector contained executable 68k code, because if it did, the BPB was valid and could be used to locate the FAT(s) and the root directory?

That's triply dumb, because of the typo noted in the article, and also because executable Atari ST bootsectors had a checksum that should be computed instead of silly heuristics, but most importantly because most Atari ST disks had no executable bootsector, but the entries concerning disk layout were still valid.

It sounds exactly like the fractal of incompetence Microsoft would implement, and it would explain why we Atarians had to use disks formatted on a PC for data transfer, even though the formats were nominally the same. Funny to read about that, because back in the day, I thought the ST somehow formatted disks "wrong".
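For reference, the checksum this comment alludes to: TOS treats a boot sector as executable when its 256 big-endian 16-bit words sum to 0x1234 (mod 0x10000). A Python sketch of that check:

```python
import struct

# An Atari ST boot sector is executable iff its 256 big-endian
# 16-bit words sum to 0x1234 modulo 0x10000.
def st_boot_checksum(sector: bytes) -> int:
    assert len(sector) == 512
    words = struct.unpack(">256H", sector)
    return sum(words) & 0xFFFF

def is_executable(sector: bytes) -> bool:
    return st_boot_checksum(sector) == 0x1234

blank = bytes(512)
print(is_executable(blank))  # False: a plain data disk doesn't boot
```

Formatters typically reserved the last word of the sector as a fudge value to make the sum come out right (or deliberately wrong, for non-bootable disks).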


The reason they had to be formatted on a PC to be used by both was only due to endianness issues:

TOS was tolerant of both big and little endian FAT

While MSDOS tolerated only little-endian.

However the GEM desktop's formatting utility formatted big-endian only.

So to get a floppy readable on both you formatted under MSDOS. Or used a better formatting utility on your Atari that let you choose little endian mode.
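To illustrate the endianness point: the same two bytes decode to very different values depending on byte order. (The offset below is the standard BPB "sectors per track" field; whether GEM's formatter really wrote such fields big-endian is this commenter's claim, not something I can confirm.)

```python
import struct

# A 16-bit BPB field such as "sectors per track" (boot sector offset 0x18):
# the bytes 09 00 mean 9 little-endian, but 2304 if read big-endian.
raw = bytes([0x09, 0x00])
little = struct.unpack("<H", raw)[0]   # how MS-DOS reads it
big = struct.unpack(">H", raw)[0]      # same bytes, opposite byte order
print(little, big)  # 9 2304
```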


Every time we talk of hardware from this era I'm reminded that I was absolutely set on getting a Commodore 64 or even better, a ColecoVision Adam computer, and my dad 'made me' get a Tandy instead. Ugly tan box. Yuck.

My dad and I don't have what you might call compatible decision-making processes, so there were many times I was disappointed by his decisions growing up. But that machine taught me DOS, the next one got me onto Windows (answering the question, "How could I possibly fill up a 43 Megabyte hard drive?"), and those got my foot in the door at one of the best jobs I ever had.

I'm still a little jealous of all of the Atari and Commodore fans out there, that I didn't get to participate. But if I'd had my way I would probably be worse off and still not be able to participate because I don't think anyone but me has ever mentioned the Adam unless I fished for it. Kids are dumb.


According to the 8 Bit Guy, the Tandy was actually a superb DOS computer, maybe the best :)


The Tandy 1000 line was a great DOS machine of its moment for games, because the 1000 wasn't just a PC clone, it was a better version of the PCjr -- it had better sound and graphics out of the box compared to PCs, with better actual PC software compatibility and without the horrible chiclet keyboard. The 1000 was so successful for a while that games that supported the PCjr's enhancements were marketed as "Tandy-compatible".


Yes, the Tandy 1000. The 8 Bit Guy has an episode devoted to its superiority vs the PCjr.


Which Tandy DOS computer? There were several. While not bad machines, the 80186 ones suffer from all sorts of more or less annoying incompatibilities because of differences in the peripheral layout and BIOS compatibility. A number of friends of mine got burned by that.


You're right! The Tandy 1000, as the sibling comment mentions.


Question to the Atari brain trust: which ST emulator is the most accurate one? I have an old ST app I wrote, which prompts a weird message about ‘getting original roms’ or some such before it quits when run on Hatari. I found a printout of the source a couple of years ago in some forgotten basement, and there is a check for Xbios(2/3) or similar before this prompt. No clue what this was about, after 35 years of writing it. If anyone has some pointers, that would be great.


Are you using an original TOS image or are you using EmuTOS (which comes with Hatari)? I have run a few old games on Hatari and have found that many of them fail to run on top of EmuTOS but no idea which OS calls are the problematic ones.


> Me supposes that, like with the ‘CALL 5’ mystery, we’ll learn the truth eventually.

This "CALL 5 mystery" sounds interesting, but I couldn't find anything about it. Perhaps somebody who knows what it's about would have more luck?


> (ST refers to Sixteen/Thirty-two, referring to the 68k CPU’s external and internal data width)

I thought ST referred to Sam Tramiel - Jack Tramiel's son.


ST was always "16/32" in conversation with Leonard Tramiel. I don't recall an instance where he admitted it might be anything else. I guess it's possible, but I think that Sam knew it would be unfair to the team.

I wrote this code in the ST BIOS. It's been 35 years, and I don't remember all the details, but I'm pretty sure it was a "Hail Mary" and was not well tested. It appeared to work, and we moved on to more important things. Certainly our really small QA staff (of 5-6 people? certainly fewer than 10) was not testing it. And we definitely had more important things to make work.

We put the ST together in about ten months; that was from absolutely zero software and no hardware to being available on shelves in stores. I still don't know how we did it. Most of us were working 80-100 hour weeks (and the software people like me were living away from our families, working with Digital Research in Monterey on finishing GEM and porting stuff over).

Every few years I run into Darek Mihocka and he gives me grief about this, and then smiles.


I loved the 1040ST that I bought at the bookstore at Berkeley, which I used throughout college. I had a color screen for games and MIDI, and a monochrome screen that I used for writing papers, programming, and rlogging into the VAXen (4.3 BSD) and Suns. I definitely appreciated the DOS data-transfer trick!


Oh boy, reminds me of the time this Atari fanboy accosted poor Sam in the Palo Alto Pizza My Heart and sputtered my confession of love for the ST at him. He looked at me like I was crazy and told me to leave him the hell alone.


Sixteen/Thirty-two seems to make more sense, especially if you look at the TT, which is Thirty-two/Thirty-two.



