kerravon
Ligao, Free World North, 07.11.2022, 00:31
nuclear war (Miscellaneous)

     In a discussion elsewhere (hercules-380), I was told 
that in a nuclear war, it is possible that all 
industrial cities in the world will be nuked, so that 
they don't have a competitive advantage. 
 
And that the only people who will still be able to 
manufacture processors will be universities, and 
they will only be able to do 8-bit computers, not 
16-bit. 
 
So there will be a time delay before new 16-bit 
computers become available. 
 
In addition, the 16-bit computers, when available, 
may or may not go through the same historical 
process, ie segmentation. It can't be ruled out. 
 
So, DOS really may be "ain't dead". 
 
As such, if anyone else has nothing better to do, 
let's standardize 16-bit segmentation computing. 
 
It doesn't necessarily need to be 8086. 
 
And it's probably possible for the same source 
base to be used for future 32-bit programming. 
 
I have made an opening offer/POC already, but it is 
not set in stone. 
 
BFN. Paul.

Rugxulo
Usono, 07.11.2022, 09:55
@ kerravon | nuclear war

     > And that the only people who will still be able to 
> manufacture processors will be universities, and 
> they will only be able to do 8-bit computers, not 
> 16-bit. 
>  
> So there will be a time delay before new 16-bit 
> computers become available. 
>  
> In addition, the 16-bit computers, when available, 
> may or may not go through the same historical 
> process, ie segmentation. It can't be ruled out. 
>  
> So, DOS really may be "ain't dead". 
>  
> As such, if anyone else has nothing better to do, 
> let's standardize 16-bit segmentation computing. 
>  
> It doesn't necessarily need to be 8086. 
 
Although I think it's unlikely, I really don't hate 8086 and think there are plenty of good compilers for it. 
 
But it's more likely they would reproduce RISC-V, Motorola 68k, or SH2.

marcov
07.11.2022, 13:28
@ kerravon | nuclear war

     > In a discussion elsewhere (hercules-380), I was told 
> that in a nuclear war, it is possible that all 
> industrial cities in the world will be nuked, so that 
> they don't have a competitive advantage. 
 
> And that the only people who will still be able to 
> manufacture processors will be universities, and 
> they will only be able to do 8-bit computers, not 
> 16-bit. 
 
And universities are not based in cities? Weird kind of philosophy.

Anyway, I think I would take a bike, and bike to ASML. Agreed, that is more than twice as far (30 min instead of 20) as the university, but still.

I also don't understand why you think that universities strictly limit themselves to pre-1985 technology.

I think overall, the bulk of DOS usage was on 32-bit-capable hardware.

DosWorld
07.11.2022, 13:54 (edited by DosWorld, 07.11.2022, 14:49)
@ kerravon | nuclear war

     > And that the only people who will still be able to 
> manufacture processors will be universities, and 
> they will only be able to do 8-bit computers, not 
> 16-bit. 
 
Here are 2 courses:
1. How to invent your own simple CPU
2. How to invent (a sort of) Java for this CPU

https://www.nand2tetris.org/

Also, the Soviet Union had pirate copies of the 8080, 8086 and 80286 - so it seems it must be easy for low-tech.

PS: Personally, dwed is hidden in GitHub's Arctic Code Vault. So I have successfully reproduced/reloaded 12ga technology, and I worry more about 9x21 being unavailable (for my Sub2000) on our civilian market than about future CPUs. Life is short. --- Make DOS great again!
Make Russia small again!

tkchia
07.11.2022, 15:33
@ kerravon | nuclear war

     Hello kerravon, 
 
> As such, if anyone else has nothing better to do, 
> let's standardize 16-bit segmentation computing. 
> It doesn't necessarily need to be 8086. 
> And it's probably possible for the same source 
> base to be used for future 32-bit programming. 
> I have made an opening offer/POC already, but it is 
> not set in stone. 
 
Well, to me the thing is this: If "let's standardize 16-/32-bit computing" is the answer, then what is the question? 
 
I am pretty sure that, in the event of a nuclear war — or for that matter, a large-scale conventional war, or some other large-scale disaster — people who want/need computing power will want it for some concrete, practical purposes.  What will these be? 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"

glennmcc
North Jackson, Ohio (USA), 07.11.2022, 20:47
@ kerravon | nuclear war

     > In a discussion elsewhere (hercules-380), I was told 
> that in a nuclear war, it is possible that all 
> industrial cities in the world will be nuked, so that 
> they don't have a competitive advantage. 
>  
 
FYI, 
after a nuclear war, computers of any CPU and OS will be 100% useless 
because the entirety of humanity will be thrust back into the stone-age. 
______________________________________________________________________________ 
 
Professor Albert Einstein was asked by friends at a recent dinner party what 
new weapons might be employed in World War III. Appalled at the implications, 
he shook his head. 
 
After several minutes of meditation, he said. "I don't know what weapons might 
be used in World War III. But there isn't any doubt what weapons will be used 
in World War IV." 
 
"And what are those?" a guest asked. 
 
"Stone spears," said Einstein. 
______________________________________________________________________________
--- --
http://glennmcc.org/

kerravon
Ligao, Free World North, 09.11.2022, 07:35
@ Rugxulo | nuclear war

     > > And that the only people who will still be able to 
> > manufacture processors will be universities, and 
> > they will only be able to do 8-bit computers, not 
> > 16-bit. 
> >  
> > So there will be a time delay before new 16-bit 
> > computers become available. 
> >  
> > In addition, the 16-bit computers, when available, 
> > may or may not go through the same historical 
> > process, ie segmentation. It can't be ruled out. 
> >  
> > So, DOS really may be "ain't dead". 
> >  
> > As such, if anyone else has nothing better to do, 
> > let's standardize 16-bit segmentation computing. 
> >  
> > It doesn't necessarily need to be 8086. 
>  
> Although I think it's unlikely, I really don't hate 8086 and think there 
> are plenty of good compilers for it. 
>  
> But it's more likely they would reproduce RISC-V, Motorola 68k, or 
> SH2. 
 
I'm not saying you're wrong. 
 
What I'm saying is that segmentation can't be ruled out. 
 
And it may go via that route for the same reason the 8086 went through that route - to maintain compatibility with an 8-bit CPU that is currently in active use running an OS like CP/M. 
 
In fact, after sorting out the standards for an MSDOS-like OS designed to run on 16:16 (with an eye to 32-bit flat), we should probably standardize on a proposal for 8-bit CPUs, to prepare for the 16:16. 
 
Note that there was a bridge from MSDOS 1.0 to MSDOS 2.0. MSDOS 2.0 introduced a radical new API. 
 
I personally haven't traditionally concerned myself with bridges. 
 
But I do note that when people didn't have proper bridges (68000, OS/2, Itanium), they tended to fail. 
 
As much as people turn their noses up at 8086 segmentation and point to the 68000, that's not what happens in real life. 
 
Note that in real life I predicted that the Amiga was going to replace the PC because it was much better, and I personally made sure my C90-compliant programs ran on both MSDOS and the Amiga (I owned both). 
 
But my personal philosophy, and my personal prediction, turned out to be total flops when it comes to what meatbags do. 
 
Oh, I ran OS/2 2.0 for quite a while too, getting software to run on that.

kerravon
Ligao, Free World North, 09.11.2022, 07:43
@ marcov | nuclear war

     > > In a discussion elsewhere (hercules-380), I was told 
> > that in a nuclear war, it is possible that all 
> > industrial cities in the world will be nuked, so that 
> > they don't have a competitive advantage. 
>  
> > And that the only people who will still be able to 
> > manufacture processors will be universities, and 
> > they will only be able to do 8-bit computers, not 
> > 16-bit. 
>  
> And universities are not based in cities ? Weird kind of philosophy. 
 
My understanding is that there are universities in cities that don't have an industrial base that would be subject to nuking. I'm currently in the (rural) Philippines, and according to what I was told in hercules-380, the only two Philippine cities subject to nuking are Manila and Quezon. My provincial capital is Legazpi, and there are definitely universities there, because my in-laws go to universities in this province. Although I don't know if any of those can fabricate chips. Maybe we have to link up with Taiwanese universities - I have no idea. 
 
So according to that theory of nuclear war, no-one is going to spend effort nuking every single city in the Philippines. Just the two that have some industrial capacity. 
 
> Anyway, I think I would take a bike, and bike to ASML. Agreed, that is more 
> than twice as far (30min instead 20) than to the university, but still. 
>  
> I also don't understand why you think that Universities strictly limit 
> themselves to pre 1985 technology.  
 
I'm talking about fabricating new chips. I was told (and I can get you the link if you want, and ask for clarification), that they are the only people who can fabricate new CPUs, and they can only do very basic 8-bit CPUs. 
 
I have no idea (and I don't think anyone else knows with any confidence either), how long it will take to get up to 16-bit CPUs. 
 
> I think overall, the bulk of Dos usage was on 32-bit capable hardware. 
 
That could be a long time coming. Or maybe it will be a short time - no-one knows for sure. 
 
My question is - if it is a long time, and 16-bit segmented architecture ends up being a thing, yet again, what do you suggest? 
 
BFN. Paul.

kerravon
Ligao, Free World North, 09.11.2022, 07:50
@ tkchia | nuclear war

     > Hello kerravon, 
>  
> > As such, if anyone else has nothing better to do, 
> > let's standardize 16-bit segmentation computing. 
> > It doesn't necessarily need to be 8086. 
> > And it's probably possible for the same source 
> > base to be used for future 32-bit programming. 
> > I have made an opening offer/POC already, but it is 
> > not set in stone. 
>  
> Well, to me the thing is this: If "let's standardize 16-/32-bit computing" 
> is the answer, then what is the question? 
>  
> I am pretty sure that, in a event of a nuclear war — or for that matter, 
> a large-scale conventional war, or some other large-scale disaster — 
> people who want/need computing power will want it for some concrete, 
> practical purposes.  What will these be? 
 
I asked that exact question, and here is the answer: 
 
https://groups.io/g/hercules-380/message/1098 
 
I have my own answer too - I don't really care what computers are used for. I know that early computers with very little memory were used for designing aircraft, which apparently requires lots of calculations to be done. 
 
Maybe no-one will be interested in aeroplanes this time around - maybe we already have designs - I have no idea - I'm not an aeroplane expert. 
 
I'm pretty sure someone will have some application for computers, no matter how little memory is available or how slow the CPU is, or even if it is made of valves. 
 
Of course you will still be able to have cage matches for the surviving computers. But as they fail, and can't be replaced, and the only new computers available for sale are 8-bit ones, you may need to make tough choices. Not everyone is capable of winning a cage match. That's my target market. 
 
But I'm thinking ahead a bit - to 16-bit. I'll go back to 8-bit after I've sorted out 16-bit to my satisfaction. 
 
BFN. Paul.

kerravon
Ligao, Free World North, 09.11.2022, 07:58
@ glennmcc | nuclear war

     > > In a discussion elsewhere (hercules-380), I was told 
> > that in a nuclear war, it is possible that all 
> > industrial cities in the world will be nuked, so that 
> > they don't have a competitive advantage. 
> >  
>  
> FYI, 
> after a nuclear war, computers of any CPU and OS will be 100% useless 
> because the entirety of humanity will be thrust back into the stone-age. 
 
I don't think that is correct. 
 
There will still be surviving computers after a nuclear war, and it won't be the stone age, it will be an interesting environment. 
 
We already know computers are possible, and a lot of concepts are already known. We just need to rebuild the manufacturing bases, without any large cities of any industrial value. 
 
Or let me put it another way. 
 
Yes, it is possible that nukes somehow take out all the people who know anything about computers, and we are literally back at the stone age. I don't want to say you are wrong. 
 
But - IF - there are still surviving computer programmers, and maybe other people with technical know-how, e.g. university professors in (random city not nuked), THEN what can we do? 
 
Or yet another way - what needs to survive a nuclear war in order to get the recovery process started? If all I need to do is print out a few pages on Wikipedia before the internet disappears, maybe I should do that while it still exists. 
 
Or are you 100% sure that there is a 0% chance of anything at all surviving except stone spears? 
 
As a computer programmer, I've learnt to not even be sure that if (1 != 0) always returns true. 
 
And ironically, that really happened to me during PDOS development: interrupts were occurring and I wasn't preserving the flags properly in the interrupt handler, so expressions like that semi-randomly returned incorrect results.

glennmcc
North Jackson, Ohio (USA), 09.11.2022, 16:06 (edited by glennmcc, 09.11.2022, 16:25)
@ kerravon | nuclear war

     > > FYI, 
> > after a nuclear war, computers of any CPU and OS will be 100% useless 
> > because the entirety of humanity will be thrust back into the stone-age. 
>  
> I don't think that is correct. 
>  
> There will still be surviving computers after a nuclear war, and it won't 
> be the stone age, it will be an interesting environment. 
>  
 
Personally, I'll take Einstein's word for it. 
 
 
 
But, be that as it may,
we might as well debate back and forth as to what would happen
after a black hole has swallowed up our sun.
 
Therefore... http://glennmcc.org/download/never_mind.web 
 
--- --
http://glennmcc.org/

tkchia
09.11.2022, 17:55
@ kerravon | nuclear war

     Hello kerravon, 
 
> > Well, to me the thing is this: If "let's standardize 16-/32-bit 
> computing" 
> > is the answer, then what is the question? 
> > I am pretty sure that, in a event of a nuclear war — or for that 
> matter, 
> > a large-scale conventional war, or some other large-scale disaster — 
> > people who want/need computing power will want it for some concrete, 
> > practical purposes.  What will these be? 
 
> I have my own answer too - I don't really care what computers are used for. 
> I know that early computers with very little memory were used for designing 
> aircraft, which apparently requires lots of calculations to be done. 
 
But how do you get from "we might want to do lots of calculations to design aircraft" to "let's standardize 16-/32-bit computing"?  How exactly does this "standardization" help anything at all? 
 
Standardization might be useful, methinks, in times of peace when people are eating tofu (to borrow a turn of phrase).  In times of war or nuclear disaster, not so much. 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"

kerravon
Ligao, Free World North, 09.11.2022, 17:58
@ glennmcc | nuclear war

     > > > FYI, 
> > > after a nuclear war, computers of any CPU and OS will be 100% useless 
> > > because the entirety of humanity will be thrust back into the 
> stone-age. 
> >  
> > I don't think that is correct. 
> >  
> > There will still be surviving computers after a nuclear war, and it 
> won't 
> > be the stone age, it will be an interesting environment. 
> >  
>  
> Personally, I'll take Einstein's word for it. 
 
Appeal to authority doesn't wash with me, and 
Einstein didn't get everything right anyway. 
 
And post-nuclear war, when someone shoots you 
in the head with a perfectly working gun because 
there weren't enough nukes to take out every 
single gun in the planet, just remember that 
some guy on "Dos Ain't Dead" said "told you so". 
 
And you can add to that the fact that depending 
on how you count, we've already had World War 3 
(2 hot, 1 cold), won in our favor already. And 
I count the "War on Terror" as World War 4 too. 
Yet another ideological war (the same as 3). 
But to actually beat "terror" requires a 
comprehensive war covering a ridiculous number 
of ideologies and even ideas, and at an 
individual level, not just a leadership level. 
 
We've been fighting WW4 since before 9/11, but 
9/11 forced the issue. 
 
It is still unknown whether anyone will use 
nukes during the ongoing WW4 conflict. 
 
I believe there was a similar anomaly in WW1 - 
there were new dreadnought ships available but 
neither side was willing to deploy them to find 
out if theirs were inferior. 
 
BFN. Paul.

kerravon
Ligao, Free World North, 09.11.2022, 18:15
@ tkchia | nuclear war

     > Hello kerravon, 
>  
> > > Well, to me the thing is this: If "let's standardize 16-/32-bit 
> > computing" 
> > > is the answer, then what is the question? 
> > > I am pretty sure that, in a event of a nuclear war — or for that 
> > matter, 
> > > a large-scale conventional war, or some other large-scale disaster — 
> > > people who want/need computing power will want it for some 
> concrete, 
> > > practical purposes.  What will these be? 
>  
> > I have my own answer too - I don't really care what computers are used 
> for. 
> > I know that early computers with very little memory were used for 
> designing 
> > aircraft, which apparently requires lots of calculations to be done. 
>  
> But how do you get from "we might want to do lots of calculations to design 
> aircraft" to "let's standardize 16-/32-bit computing"?  How exactly does 
> this "standardization" help anything at all? 
 
I like to code to a standard for my own code. 
 
Normally that is C90, and I have gone an awful 
long way with just C90. 
 
But at the end of the day, I need to be able 
to do a "dir", and that involves, for me, 
at least currently, using PosGetDTA, PosFindFirst 
and PosFindNext. 
 
And although C90 can hide lots of things from a
programmer, as a C90 library author myself I don't
get that hidden from me, so even though the
need for "dir" doesn't exist in C90, I do need
PosOpenFile.
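
For example, a bare-bones "dir" on top of those calls
comes out something like this (just a sketch - check
pos.h for the real declarations, and I'm assuming the
traditional DOS DTA layout with the ASCIIZ filename at
offset 30):

    /* minimal "dir" sketch on top of the Pos* API - illustrative only */
    #include <stdio.h>
    #include "pos.h"   /* assumed to declare PosGetDTA, PosFindFirst, PosFindNext */

    int main(void)
    {
        char *dta = PosGetDTA();               /* buffer FindFirst/FindNext fill in */
        int ret = PosFindFirst("*.*", 0x10);   /* 0x10 = include directories */

        while (ret == 0)
        {
            printf("%s\n", dta + 30);          /* ASCIIZ name at DTA offset 30 */
            ret = PosFindNext();
        }
        return 0;
    }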
 
Maybe I'm the only person in the world who wants 
a standard API, but I'd be surprised if that was 
the case. Why did people come up with POSIX if 
no-one needs a standard OS interface? 
 
I can't use POSIX myself, because it is full of
crap like fork() which is not suitable for the
low-end machines that MSDOS ran on.
 
> Standardization might be useful, methinks, in times of peace when people 
> are eating tofu 
> (to 
> borrow a turn of phrase).  In times of war or nuclear disaster, not 
> so much. 
 
We're currently at what counts as "peace". 
 
And I'm interested for historical reasons anyway. 
We were at nominal "peace" in the 1980s which is 
when this should have been done. The ARM CPU was 
in fact available in 1985. The computers that 
used it should have been running an MSDOS clone 
that allowed source mode compatibility due to the 
standardized API. 
 
For whatever reason that wasn't done already in 
say 1981, in preparation for the possibility of 
the ARM, or 68000, but I don't particularly care 
why it wasn't already done, I just want to do it 
belatedly. 
 
I was programming in C in about 1987, and I coded 
to the ANSI C draft, which meant I couldn't do 
things like directory traversal. 
 
There were other things I realize I should have 
been able to do as well, like use ANSI output 
and have the OS have an option to bypass the 
BIOS so that it was fast, instead of every 
single fullscreen program doing exactly that itself. 
 
And get ANSI keyboard strokes, not just ANSI output. 
 
I'm basically trying to reconcile the problems I 
had when starting in 1987. 
 
BFN. Paul.

tkchia
09.11.2022, 18:35
@ kerravon | nuclear war

     Hello kerravon, 
 
> > But how do you get from "we might want to do lots of calculations to 
> design 
> > aircraft" to "let's standardize 16-/32-bit computing"?  How exactly does 
> > this "standardization" help anything at all? 
> I like to code to a standard for my own code. 
 
Then you are not solving any actual problem — you are just describing a problem in terms of your solution. 
 
Sorry, I still fail to see how you get from "we might want to do lots of calculations to design aircraft — in case of a nuclear war" to "let's standardize 16-/32-bit computing". 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"

marcov
09.11.2022, 20:18
@ kerravon | nuclear war

     > My question is - if it is a long time, and 16-bit segmented architecture 
> ends up being a thing, yet again, what do you suggest? 
 
Do what I already do daily now: keep on programming the Microchip dsPIC33<x>. It is a segmented 16-bit Harvard architecture.

Anyway, the whole scenario is so absurd, and has so many variables, that an answer for a 16-bit-x86-only world would be likewise absurd.

Either some production capacity is saved, or Einstein was right and WW IV will be fought with sticks and stones.

But suddenly an architecture that is convoluted and not in active production gets resurrected again? Nonsense.

More likely the ability to bring new designs into production is damaged, and they can only keep the production setup they have now... none of which is 16-bit-only x86.

kerravon
Ligao, Free World North, 09.11.2022, 23:10
@ tkchia | nuclear war

     > Hello kerravon, 
>  
> > > But how do you get from "we might want to do lots of calculations to 
> > design 
> > > aircraft" to "let's standardize 16-/32-bit computing"?  How exactly 
> does 
> > > this "standardization" help anything at all? 
> > I like to code to a standard for my own code. 
>  
> Then you are not solving any actual problem — you are just describing a 
> problem in terms of your solution. 
 
The problem is that POSIX exists for a reason, but 
there is no equivalent for small systems, like we 
had in the 1980s, and we may have again. 
 
I am still programming for that era. 
 
And that era may return. 
 
> Sorry, I still fail to see how you get from "we might want to do lots of 
> calculations to design aircraft — in case of a nuclear war" 
 
It's after nuclear war. I'm surprised you think there 
will be no use for computers after nuclear war. 
 
> to "let's 
> standardize 16-/32-bit computing". 
 
What's wrong with coding to a standard so that you 
can use computers from multiple different vendors? 
 
BFN. Paul.

kerravon
Ligao, Free World North, 09.11.2022, 23:25
@ marcov | nuclear war

     > > My question is - if it is a long time, and 16-bit segmented architecture 
> > ends up being a thing, yet again, what do you suggest? 
>  
> Do what i already do daily now. Keep on programming Microchip dspic33<x> 
>    It is a segmented 16-bit Harvard architecture. 
>  
> Anyway, the whole scenario is so absurd and with so many variables, that an 
> answer to a 16-bit x86 only world would be likewise absurd. 
 
It may not be x86, it may be 16-bit segmented with 
a different instruction set. 
 
> Either some production capacity is saved, 
 
I'm specifically talking about the situation where 
no production capacity is saved, it was deliberately 
wiped out to prevent any country having a competitive 
advantage in recovery. 
 
> or Einstein was right and the WW 
> IV will be fought with sticks and stones. 
 
I don't know how you can possibly predict the future 
with such accuracy that those are the only 2 
possibilities. 
 
Regardless, even if your crystal ball is that accurate,
I would then like an answer to a hypothetical question.

What is an appropriate standard for a world where, unlike
the guaranteed real world with only 2 choices, 16:16
segmentation becomes a thing again?
 
> But suddenly an architecture that is convoluted and not in active 
> production is resurrected again ? Nonsense. 
 
It won't immediately be resurrected. Like I said, when
only universities (outside industrial cities) have the
ability to fabricate chips, and they are only capable
of fabricating 8-bit CPUs, it is 8-bit that will be in
active production.
 
I don't know how long it will take to reach 16:16. 
 
I just want to be ready for when it does. Even if 
that is 5000 years. 
 
I want to have the standards documented and code 
written to that standard for people 5000 years from 
now. I'll probably try to get my code punched on to 
plastic cards to be machine-readable as well as on 
commercially-produced CDROM and paper printout, and 
a bare minimum guide on some piece of metal or 
something like that. 
 
> More likely the ability to bring new designs in production is damaged, and 
> they can only keep the production setup they have now. ..... non of which 
> are 16-bit only x86. 
 
I'm not making any claim on what is "more likely". 
 
Even if it is "less likely", or even if you can
guarantee it is non-existent, I'm interested in
the scenario where there is 16-bit x86, or even
a 16-bit CPU with some other instruction set
(because you don't know - that's why you need C90).
 
Or if you like, the same question another way - 
what would have been a good standard to have in 
the 1980s, to complement C90, to provide people 
with more options than just MSDOS and 8086? 
 
And specifically, what needed to exist so that the 
Amiga would be a viable replacement for businesses 
struggling with a 640k memory limit? 
 
It may not be just language standardization, but 
also build mechanisms that needed to exist, or 
even culture changes. 
 
But language standardization should be one of those 
things, which would have at least made sure that 
perfectly valid Amiga code was sitting there, and 
just needed to be recompiled for the 68000 and 
voila, decent business software. 
 
BFN. Paul.

tkchia
10.11.2022, 00:07
@ kerravon | nuclear war

     Hello kerravon, 
 
> The problem is that POSIX exists for a reason, but 
> there is no equivalent for small systems, like we 
> had in the 1980s, and we may have again. 
 
Have you ever coded before for an Intel 8080-based system, or some other system with quite literally less than 64 KiB of RAM?  I have. 
 
With all due respect, methinks you know not whereof you speak. 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"

tkchia
10.11.2022, 00:25
@ marcov | nuclear war

     Hello marcov, 
 
> Do what i already do daily now. Keep on programming Microchip dspic33<x> 
>    It is a segmented 16-bit Harvard architecture. 
> Anyway, the whole scenario is so absurd and with so many variables, that an 
> answer to a 16-bit x86 only world would be likewise absurd. 
 
Well, to borrow a phrase from Jamie Zawinski: 
 
(1) you see a programming problem M involving a dsPIC33 chip (or some such), 
 
(2) and you think "I know, I'll write a self-hosted C compiler for the chip that runs on the same chip itself, so that I can solve my original problem M". 
 
(3) Now you have two problems.    
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"

kerravon
Ligao, Free World North, 10.11.2022, 01:16
@ tkchia | nuclear war

     > Hello kerravon, 
>  
> > The problem is that POSIX exists for a reason, but 
> > there is no equivalent for small systems, like we 
> > had in the 1980s, and we may have again. 
>  
> Have you ever coded before for an Intel 8O8O-based system, or some other 
> system with quite literally less than 64 KiB of RAM?  I have. 
 
I started programming in assembly with the 
Commodore 64 in 1984. Not a proper assembler - 
the equivalent of MSDOS "debug". 
 
> With all due respect, methinks you know not whereof you speak. 
 
Sorry, I have somehow failed to be clear. 
 
Although post-nuclear war it is possible that only 
8-bit computers can be manufactured, I'm not asking 
about an API for them. That's something I'll think 
about later. 
 
What I'm interested in is what to do if/when technology 
reaches a 16:16 stage. 
 
And not necessarily a 4-bit segment shift. It could be
a 5-bit segment shift, which would give access to 2 MB
of memory, which is more practical for PDOS/86.
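
The arithmetic, just to show what the extra shift bit
buys (a sketch; the shift value is whatever the
hypothetical hardware picks):

    /* physical address from a 16-bit segment and offset, configurable shift */
    #include <stdio.h>

    static unsigned long physaddr(unsigned int seg, unsigned int off, int shift)
    {
        return ((unsigned long)seg << shift) + off;
    }

    int main(void)
    {
        /* 4-bit shift (8086 style): tops out just over 1 MB */
        printf("%lX\n", physaddr(0xFFFFU, 0xFFFFU, 4));   /* 10FFEF */
        /* 5-bit shift: the same 16:16 registers now reach just over 2 MB */
        printf("%lX\n", physaddr(0xFFFFU, 0xFFFFU, 5));   /* 20FFDF */
        return 0;
    }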
 
Note that the 80286, while it doesn't give a 5-bit 
segment shift, can effectively allow programs to use 
2 MB or more memory and not be aware that they are 
not running on an 8086. 
 
I should also point out that creating an API for small 
(16:16) systems is not technically impossible. I have 
already made an opening offer here: 
 
https://sourceforge.net/p/pdos/gitcode/ci/master/tree/src/pos.c 
 
Basically, if ISO had got together in the 1980s, and
had that already written, and realized that they
should support more than the 8086, such as the 68000,
still with no virtual memory so Unix is not an option,
what would ISO come up with?
 
Also assuming that ISO had standardized C90 in 1980 
instead of waiting for it to become popular. 
 
And ISO could have standardized a replacement for pos.c 
in 1980 as well. There was no reason they needed to wait 
for an actual OS for the 8086. Or even an actual 8086. 
The concept of 16:16 and flat 32 exists independently of 
Intel and Motorola. 
 
BFN. Paul.

marcov
10.11.2022, 11:19
@ kerravon | nuclear war

Well, the only thing I can say is that universities would simply make a linear 24 or 32-bit address space, or use some other, better addressing scheme to address more than 16 bits' worth of memory (e.g. by having wider address registers). 
 
Keep in mind that the 16-bit x86 segment model is mostly due to legacy with CP/M, something that wouldn't matter after WW-III.  
 
P.S. The dsPIC is a true 16-bit MPU. Most only have 28 or 56 KB, and those are already the higher-end parts. Recently a new CK breaks the 64 KB barrier, but that is a recent development. Self-hosted compilers are therefore unlikely.

marcov
10.11.2022, 11:28
@ kerravon | nuclear war

     > Basically, if ISO had got together in the 1980s, 
 
Well, strictly it was, since C was approved by ANSI in 1989, and the ISO certification in 1990 was a form of rubber-stamping for this side of the pond.

kerravon
Ligao, Free World North, 10.11.2022, 13:33
@ marcov | nuclear war

     > Well, the only thing I can say is that universities would simply make a 
> linear 24 or 32-bit address space or use some other better addressing 
> scheme to access 16+ quantities (e.g. by having wider addressing 
> registers) 
 
Universities won't be doing anything other than 
producing 8-bit CPUs. It will require industry 
to be formed to produce 16-bit CPUs. 
 
> Keep in mind that the 16-bit x86 segment model is mostly due to legacy with 
> CP/M, something that wouldn't matter after WW-III.  
 
You don't know that. We don't know how long we
will remain on 8-bit CPUs until industry
gets re-established. It could be 100 years. If
the only CPUs for the last 100 years were 8-bit,
not necessarily 8080, what OS do you expect to
be run on them if not something like CP/M?
 
Whatever your answer is, that's the new legacy. 
 
So when 16:16 finally arrives on the scene, people 
will likely want to be able to run legacy code, 
for the same reason they did up to now. 
 
> p.s. the dspic is a true 16-bit mpu. Most only have 28 or 56k, and those 
> are already the more high ends.  Recently a new CK breaks the 64k barrier, 
> but that is a recent requirement. Self hosted compilers are therefore 
> unlikely. 
 
SubC runs in the small memory model, i.e. less than 128k.
At least it did until quite recently, when I recompiled it
in the large memory model after more functionality was
added.

Self-hosted C compilers already existed on 64k machines;
they were just written with multiple phases instead of
being self-contained like SubC.
 
That's my understanding, anyway. 
 
>> Basically, if ISO had got together in the 1980s, 
 
> Well, strictly it was, since it was approved by 
> Ansi in 1989, and the ISO certification in 1990 
> was a form of rubber stamping for this side of the pond. 
 
Ok, sure. What I meant was if ANSI had gotten their 
act together a bit earlier so that C90/C89 was more 
like C80 or C83, something like that. 
 
BFN. Paul.

tom
Germany (West), 10.11.2022, 14:50
@ kerravon | nuclear war

Your scenario (we have to use 8/16-bit computers) is very unlikely to happen because:

people who have been killed by the nuclear blast no longer need computers.

all other people have one or more computers, both at home and in the office, with no need to replace them soon at all; in particular not with obscure 16-bit computers that nobody can program.

they can probably even use the computers of their dead colleagues, as computers are probably less susceptible to nuclear blast/fallout than people are.

however, it could be problematic to find electricity to run your computer. Maybe only compute when the sun shines or the wind blows.

kerravon
Ligao, Free World North, 10.11.2022, 15:13
@ tom | nuclear war

     > your scenario (we have to use 8/16 bit computers) is very unlikely to 
> happen because: 
>  
> people who have been killed by the nuclear blast no longer need computers. 
>  
> all other people have one or more computers, both at home and in the office 
> with no need to replace them soon at all; in particular not by obscure 16 
> bit computers that nobody can program. 
 
People buy new computers for a reason. As computers
fail, anyone who wants a new one will only have
one choice - an 8-bit CPU. Or to shoot their neighbor
and take his.
 
As time goes by, there will be no more neighbors to 
shoot. 
 
Anyhow, again, if you're 100% certain that there's 
an infinite supply of neighbors to shoot, that's 
fine, I'm not saying you're wrong. 
 
But, as a hypothetical, if you are wrong, then can 
we have a discussion of the 8 to 16 to 32 migration 
path? 
 
> however, it could be problematic to find electricity to run your computer. 
> maybe only compute when the sun shines or the wind blows  
 
Yes, in the hercules-380 group I discussed solar power 
as well, and based on their recommendation bought some 
to try. It's tough to power even just my smartphone. 
 
Luckily my smartphone now runs the beginning of MSDOS, 
natively. The main thing I'm missing is sign-off on 
the standards. 
 
BFN. Paul.

kerravon
Ligao, Free World North, 10.11.2022, 15:27
@ tom | nuclear war

     > in particular not by obscure 16 
> bit computers that nobody can program. 
 
What do you mean by this? Why can't people 
program 16 bit computers? 
 
Thanks. Paul.

tom
Germany (West), 10.11.2022, 15:55
@ kerravon | nuclear war

     > People buy new computers for a reason. 
 
Right. Like running the most recent game at the highest resolution, or running the most recent Windows. 
 
I don't see how an 8-bit CPU could ever fulfill such a reason.

tom
Germany (West), 10.11.2022, 15:58
@ kerravon | nuclear war

     > > in particular not by obscure 16 
> > bit computers that nobody can program. 
>  
> What do you mean by this? Why can't people 
> program 16 bit computers? 
 
I'm aware you missed this, but programmers today (mostly) learn to code in Python, Perl, or JavaScr***. These languages aren't that widespread in 16-bit land.

kerravon
Ligao, Free World North, 10.11.2022, 16:22
@ tom | nuclear war

     >> People buy new computers for a reason. 
 
> right. like running the most recent game in 
> highest resolution, or running the most recent windows. 
 
Or their old computer breaking. 
 
I bought 4 old desktops recently, and all were working, 
and running Windows 7. 
 
One of them now no longer powers on, and another 
doesn't boot from hard disk anymore, but I can 
still boot PDOS from USB stick. 
 
When the next 2 fail, what do you propose I replace 
them with when the production lines have become 
glass, but universities are still able to produce 
some, and/or when industry starts to ramp up again? 
 
Shooting the neighbors is like socialism - eventually 
you run out of other people's money. 
 
> I don't see how an 8-bit CPU could ever fulfill such a reason. 
 
I happily used a Commodore 64 for years, and 
I wouldn't say that I ever completely mastered it. 
 
Other people managed to get a C compiler working 
on it, I never did that, as one example. 
 
> > > in particular not by obscure 16 
> > > bit computers that nobody can program. 
> >  
> > What do you mean by this? Why can't people 
> > program 16 bit computers? 
>  
> I'm aware you missed this, but programmers today (mostly) learn to code in 
> python, Perl, or javascr***. these languages aren't this widespread in 
> 16-bit land. 
 
And what's preventing them from picking up C, 
or the other languages available on the 8086? 
 
Are you suggesting that programmers have 
devolved to a point where they can't learn 
new languages? Even if some of them have, 
it's surely not all of them. 
 
BFN. Paul.

glennmcc
North Jackson, Ohio (USA), 10.11.2022, 16:55
@ kerravon | nuclear war

     > > > > FYI, 
> > > > after a nuclear war, computers of any CPU and OS will be 100% 
> useless 
> > > > because the entirety of humanity will be thrust back into the 
> > stone-age. 
> > >  
> > > I don't think that is correct. 
> > >  
> > > There will still be surviving computers after a nuclear war, and it 
> > won't 
> > > be the stone age, it will be an interesting environment. 
> > >  
> >  
> > Personally, I'll take Einstein's word for it. 
>  
> Appeal to authority doesn't wash with me, and 
> Einstein didn't get everything right anyway. 
>  
 
I have one last question for you on this totally ridiculous and totally absurd subject. 
 
Since you seem to feel that you are a better "authority" than was Einstein... 
 
What is your assessment as to the capabilities of computers and OSs
available to us after a black hole has swallowed up our sun?

Will we have 16-bit machines booted to DOS?

Or will we then need to resort to using machines like this one?
https://images.computerhistory.org/revonline/images/102646242-05-01.jpg?w=600

Or perhaps _this_ will be our only available computer?
https://en.wikipedia.org/wiki/Abacus#/media/File:RomanAbacusRecon.jpg --- --
http://glennmcc.org/

kerravon
Ligao, Free World North, 10.11.2022, 17:18
@ glennmcc | nuclear war

     > > > > > FYI, 
> > > > > after a nuclear war, computers of any CPU and OS will be 100% 
> > useless 
> > > > > because the entirety of humanity will be thrust back into the 
> > > stone-age. 
> > > >  
> > > > I don't think that is correct. 
> > > >  
> > > > There will still be surviving computers after a nuclear war, and it 
> > > won't 
> > > > be the stone age, it will be an interesting environment. 
> > > >  
> > >  
> > > Personally, I'll take Einstein's word for it. 
> >  
> > Appeal to authority doesn't wash with me, and 
> > Einstein didn't get everything right anyway. 
>  
> I have one last question for you on this totally ridiculous and totally 
> absurd subject. 
>  
> Since you seem to feel that you are a better "authority" than was 
> Einstein... 
 
I didn't make such a claim. 
 
> What is your assessment as to the capabilities of computers and OSs 
> available to us after a blackhole has swallowed-up our sun ? 
 
I don't have an opinion on that. 
 
> Will we have 16bit machines booted to DOS ? 
 
I have no idea about that, but for as long as the
possibility exists that someone may wish to
manufacture 16-bit computers that boot to DOS,
I would like to have standards for such
computers organized now.
 
BFN. Paul.

tkchia
10.11.2022, 17:51 (edited by tkchia, 10.11.2022, 18:41)
@ kerravon | nuclear war

     Hello kerravon, 
 
> I started programming in assembly with the 
> Commodore 64 in 1984. Not a proper assembler - 
> the equivalent of MSDOS "debug". 
 
OK — now try to get a self-hosted C compiler working on that. 
 
> > With all due respect, methinks you know not whereof you speak. 
... 
> What I'm interested in is what to do if/when technology 
> reaches a 16:16 stage. 
 
Well, yes, if we end up in some universe where your idea might make sense, then ... your idea might actually make sense.  Yes, yes, if you put it that way, Mr. Captain Obvious Tautology. 
 
The real question here is why you think there might be a snail's chance that your idea may make any sense.  Because, you know, disasters have a way of not going according to our expectations or wishes.  And that is partly what makes them disasters. 
 
And nowhere do you explain 
- why this "standardization" is so important in a post-nuclear world 
- why your proposed standards are any good 
- or why existing standards or existing practices somehow fall short. 
 
On a somewhat unrelated tangent: 
 
> And you can add to that the fact that depending 
> on how you count, we've already had World War 3 
> (2 hot, 1 cold), won in our favor already. And 
 
Well, that certainly looks like a view of history straight out of the "Project for the New American Empire Century".  It is easy to wax lyrical about how "we" are on the "winning" side — whatever that means — if "we" do not end up as collateral damage.  And there was much needless collateral damage during the Cold War.  But I digress. 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"

tkchia
10.11.2022, 18:06 (edited by tkchia, 10.11.2022, 18:41)
@ kerravon | nuclear war

     Hello kerravon, 
 
> I should also point out that creating an API for small 
> (16:16) systems is not technically impossible. I have 
> already made an opening offer here: 
> https://sourceforge.net/p/pdos/gitcode/ci/master/tree/src/pos.c 
 
Well... what is the niche your proposed "standard" is supposed to fill? 
 
There is definitely already some sort of de facto common API that was implemented across the major compilers targeting MS-DOS — including Open Watcom, Microsoft C, and later versions of Borland C++. 
 
(Edit: and Digital Mars.) 
 
So what exactly does your new proposed standard offer? 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"

DosWorld
10.11.2022, 19:43 (edited by DosWorld, 10.11.2022, 19:59)
@ kerravon | nuclear war

     > And that the only people who will still be able to 
> manufacture processors will be universities, and 
> they will only be able to do 8-bit computers, not 
> 16-bit. 
 
Computers - not needed. Just use your imagination.
https://www.amazon.com/-/dp/0998379417/

I hope the author of this book will write the same kind of book, but about Word and Excel.

PS:

PPS: Seriously, why should humanity follow the path of technocracy again? They may choose a completely different way. For example, they might see the 8086 memory model and 68 genders, get horrified, say "never again!", and become druids. --- Make DOS great again!
Make Russia small again!

glennmcc
North Jackson, Ohio (USA), 10.11.2022, 20:58
@ kerravon | nuclear war

     > > Will we have 16bit machines booted to DOS ? 
>  
> I have no idea about that, but while ever the 
> possibility exists that someone may wish to 
> manufacture 16-bit computers that boot to DOS, 
> I would like to have standards for such 
> computers organized now. 
>  
 
List of items on the minds of the survivors of a nuclear war. 
 
1) where do I find food, water & shelter. 
2) how to protect myself from those trying to kill me 
   to take my food, water & shelter. 
... 
... 
... 
1,000,000) what type of computer & OS will my great, great grand kids 
           have access to in the distant future once the electric grid 
           and internet have been rebuilt. 
 
LOL --- --
http://glennmcc.org/

kerravon
Ligao, Free World North, 10.11.2022, 23:23
@ glennmcc | nuclear war

     > > > Will we have 16bit machines booted to DOS ? 
> >  
> > I have no idea about that, but while ever the 
> > possibility exists that someone may wish to 
> > manufacture 16-bit computers that boot to DOS, 
> > I would like to have standards for such 
> > computers organized now. 
> >  
>  
> List of items on the minds of the survivors of a nuclear war. 
>  
> 1) where do I find food, water & shelter. 
> 2) how to protect myself from those trying to kill me 
> to take my food, water & shelter. 
> ... 
> ... 
> ... 
> 1,000,000) what type of computer & OS will my great, great grand kids 
> have access to in the distant future once the electric grid 
> and internet have been rebuilt. 
 
That list is the same even without nuclear war. 
 
I just happen to be one of the people who is interested 
in something similar to what is on the bottom of most 
people's lists. 
 
BFN. Paul.

kerravon
Ligao, Free World North, 10.11.2022, 23:25
@ DosWorld | nuclear war

     > > And that the only people who will still be able to 
> > manufacture processors will be universities, and 
> > they will only be able to do 8-bit computers, not 
> > 16-bit. 
>  
> Computers - not need. Just use your imagination. 
> https://www.amazon.com/-/dp/0998379417/ 
>  
> I hope, author of this book will write the same book, but about Word and 
> Excel. 
>  
> PS:   
>  
> PPS: Seriously, why should humanity follow the path of technocracy again? 
> They may choose a completely different way. For example, they can see 8086 
> memory model, 68 genders, get horrified and say "never again!", and become 
> druids. 
 
I'm not saying they will or they won't. 
 
I'm only saying that while the possibility exists, 
I would like to plan for it. 
 
BFN. Paul.

kerravon
Ligao, Free World North, 10.11.2022, 23:45
@ tkchia | nuclear war

     > > I started programming in assembly with the 
> > Commodore 64 in 1984. Not a proper assembler - 
> > the equivalent of MSDOS "debug". 
>  
> OK — now try to get a self-hosted C compiler working on that. 
 
Other people did that. But why would I personally 
want to do that? I didn't say that that was 
something I personally wanted to do. I didn't 
even say that I wanted to program in C at all 
on an 8-bit machine. 
 
My 16-bit OS only really becomes practical with 
about 2 MB of memory, so I need a 16:16 machine 
with a 5-bit segment shift, or something similar 
to the 80286 will also work, and that is my 
interest and priority. 
 
I just want a set of standards to work to for 
all that. 
 
> > > With all due respect, methinks you know not whereof you speak. 
> ... 
> > What I'm interested in is what to do if/when technology 
> > reaches a 16:16 stage. 
>  
> Well, yes, if we end up in some universe where your idea might make sense, 
> then ... your idea might actually make sense.  Yes, yes, if you put it that 
> way, Mr. Captain Obvious Tautology. 
>  
> The real question here is why you think there might be a snail's 
> chance that your idea may make any sense.  Because, you know, disasters 
> have a way of not going according to our expectations or wishes.  
> And that is partly what makes them disasters. 
 
I've already outlined why - if the nuclear war goes 
a certain way, new computers will be 8-bit, and when 
new computers reach 16-bit, and memory availability 
exceeds 64k, segmentation may well be chosen as a 
solution. It's happened before. 
 
> And nowhere do you explain 
> - why this "standardization" is so important in a post-nuclear world 
 
I'm not particularly claiming that it is "important". 
I just want a standard to code to. POSIX doesn't cut it. 
 
> - why your proposed standards are any good 
 
I didn't claim that either. 
 
> - or why existing standards or existing practices somehow fall short. 
 
The only existing standard that I know of is POSIX, 
and it falls short because it is not appropriate 
for small computers like the 8086 because it 
basically requires virtual memory to support crap 
like fork(). If they remove fork() from POSIX and 
only have posix_spawn(), that may be a step in the 
right direction, but I'm not sure it is sufficient. 
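
In other words, something along these lines is fine even
without virtual memory, because the child is created
directly instead of duplicating the parent's address
space first (a sketch, assuming a POSIX-ish hosted
environment with posix_spawnp() and waitpid()):

    /* spawn a child without fork() - sketch, POSIX-ish environment assumed */
    #include <stdio.h>
    #include <spawn.h>
    #include <sys/wait.h>

    extern char **environ;

    int main(void)
    {
        pid_t pid;
        int status;
        char *argv[] = { "echo", "hello", (char *)0 };

        if (posix_spawnp(&pid, "echo", NULL, NULL, argv, environ) != 0)
        {
            printf("spawn failed\n");
            return 1;
        }
        waitpid(pid, &status, 0);
        return 0;
    }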
 
I would be interested in your opinion if you think 
that is all that is required. 
 
Existing practices I'm not actually aware of. I 
never wrote DOS-specific software, I followed the 
C90 standard, and still do. I do know that people 
directly wrote to 0xb8000 and I also know that 
Microsoft only supported the ANSI terminal in 
Windows very recently, and I know for MSDOS they 
only ever supported ANSI output, not keyboard 
input. 
 
So I know that standard wasn't being followed. I 
follow it myself though, for fullscreen 
applications that I support on PDOS/386 (and 
recent Windows). 
 
> On a somewhat unrelated tangent: 
>  
> > And you can add to that the fact that depending 
> > on how you count, we've already had World War 3 
> > (2 hot, 1 cold), won in our favor already. And 
>  
> Well, that certainly looks like a view of history straight out of the 
> "Project 
> for the New American Empire Century".  It is easy to wax 
> lyrical about how "we" are on the "winning" side — whatever that means 
 
Yeah, some people like to pretend there was no winner
of the Cold War. If the Soviets had actually managed
to enslave all of Europe, and you happened to be
living in Western Europe when they kicked down your
door, maybe you would understand the reality of what
it's like to lose the Cold War.
 
> — if "we" do not end up as collateral damage.  And there was much 
> needless collateral damage during the Cold War.  But I digress. 
 
Take it up with Mr Marx. 
 
>> I should also point out that creating an API for small 
>> (16:16) systems is not technically impossible. I have 
>> already made an opening offer here: 
>> https://sourceforge.net/p/pdos/gitcode/ci/master/tree/src/pos.c 
 
> Well... what is the niche your proposed "standard" is supposed to fill? 
 
People code int86(...), which looks bad and
doesn't work when upgrading to 32-bit and
64-bit, or to different processors like the
68000 that would otherwise be capable of
running your application.
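
To make that concrete, here is the sort of thing I mean
(a sketch - int86()/union REGS as in the classic DOS
compilers' dos.h; the Pos* call in the comment is the
kind of wrapper I'm talking about, check pos.h for the
real name):

    /* raw int86() call - tied to the 8086 and to INT 21h numbering */
    #include <stdio.h>
    #include <dos.h>

    int main(void)
    {
        union REGS regs;

        regs.h.ah = 0x30;              /* INT 21h, AH=30h: get DOS version */
        int86(0x21, &regs, &regs);
        printf("%d.%d\n", regs.h.al, regs.h.ah);

        /* versus something like:
               ver = PosGetDosVersion();
           which a library can map onto whatever the underlying
           system is, even when the CPU is not an 8086 at all */
        return 0;
    }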
 
> There is definitely already some sort of de facto common API that was 
> implemented across the major compilers targeting MS-DOS — including Open 
> Watcom, Microsoft C, and later versions of Borland C++. 
 
> (Edit: and Digital Mars.) 
 
> So what exactly does your new proposed standard offer? 
 
Perhaps nothing. Is there any reason why OS/2 2.0 
didn't use that same API? And 64-bit Windows? Or 
rather - could it? 
 
If there's nothing wrong with it, and the only issue 
is that ISO isn't interested in publishing a formal 
standard, so it needs to remain a "de facto common 
API", so be it, I'll probably switch to that, and 
write it in terms of Pos* calls. And perhaps write 
another version that turns them into Windows calls, 
and another version that turns them into OS/2 calls, 
and another version that turns them into POSIX 
calls. 
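
As a sketch of what one of those versions would look like
(the PosOpenFile signature here is only indicative - the
real one is in pos.h - and the "0 = read-only" convention
just mirrors the DOS open call):

    /* one Pos* call written in terms of the POSIX API - illustrative only */
    #include <fcntl.h>

    int PosOpenFile(const char *name, int mode, int *handle)
    {
        int fd = open(name, mode == 0 ? O_RDONLY : O_RDWR);

        if (fd < 0) return 1;     /* non-zero = failure, DOS-style */
        *handle = fd;
        return 0;
    }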
 
Or it could be done the other way around - take the 
Windows API and implement it for MSDOS, since 
Windows doesn't use fork(). 
 
Or it could be none of the above. That's my question. 
 
BFN. Paul.

Rugxulo
Usono, 11.11.2022, 09:18
@ kerravon | nuclear war

     > My 16-bit OS only really becomes practical with 
> about 2 MB of memory, so I need a 16:16 machine 
> with a 5-bit segment shift, or something similar 
> to the 80286 will also work, and that is my 
> interest and priority. 
 
There were 186 clones with 24-bit addressing. The 186 was still being made at least until 2007. (I believe OpenWatcom contributor Wilton Helm had much experience with embedded 186.) 
 
https://www.cpushack.com/2013/01/12/the-intel-80186-gets-turbocharged-vautomation-turbo186/ 
 
There was also the Bandai Wonderswan (NEC V30) circa 1999: 
 
https://en.wikipedia.org/wiki/WonderSwan 
 
> Is there any reason why OS/2 2.0 didn't use that same API? 
> And 64-bit Windows? Or rather - could it? 
 
Microsoft wanted to "control the standard", so to speak, but IBM fired them. They didn't want to license *nix from AT&T for Xenix, for instance. They wanted to do their own thing. 
 
http://gunkies.org/wiki/Gordon_Letwin_OS/2_usenet_post  (circa 1995) 
 
> Or it could be done the other way around - take the 
> Windows API and implement it for MSDOS, since 
> Windows doesn't use fork(). 
 
There are lots of software patents and lawyers. While many agree that APIs can't be copyrighted, it's still a minefield. Just because they "got away" with it in the old days (e.g. PC-DOS vs. CP/M, Compaq vs. IBM BIOS) doesn't mean they wouldn't still clamp down in a heartbeat if they could. 
 
https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_Inc. 
 
(I don't really want to mention that, but for completeness, it's worth noting ... barely.)  | 
     
                
             marcov 
         11.11.2022, 11:09                        
  @ kerravon
         | 
     nuclear war | 
    
    
     > > Well, the only thing I can say is that universities would simply make a 
> > linear 24 or 32-bit address space or use some other better addressing 
> > scheme to access 16+ quantities (e.g. by having wider addressing 
> > registers) 
>  
> Universities won't be doing anything other than 
> producing 8-bit CPUs. It will require industry 
> to be formed to produce 16-bit CPUs. 
 
Yeah, because universities currently do nothing but make 8-bit CPUs, and all industry only makes 16-bit+ CPUs (I wouldn't be surprised if it is actually the other way around: no universities doing much with 8-bit now, and industry still making them for washing machines and the like) 
 
As said, the whole argument hinges on these artificial boundary conditions that make the whole thing ludicrous. And then on top of that comes your weird obsession with retrofitting POSIX onto DOS, which it never was. 
 
Again: total fantasy world.  | 
     
                
             marcov 
         11.11.2022, 13:12                        
  @ Rugxulo
         | 
     nuclear war | 
    
    
     > There are lots of software patents and lawyers. While many agree that APIs 
> can't be copyrighted, it's still a minefield. Just because they "got away" 
> with it in the old days (e.g. PC-DOS vs. CP/M, Compaq vs. IBM BIOS) doesn't 
> mean they wouldn't still clamp down in a heartbeat if they could. 
>  
> https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_Inc. 
>  
> (I don't really want to mention that, but for completeness, it's worth 
> noting ... barely.) 
 
     WDOSX? Wine?  | 
     
                
             kerravon 
        
  
  Ligao, Free World North,  12.11.2022, 07:11                        
  @ marcov
         | 
     nuclear war | 
    
    
     > > > Well, the only thing I can say is that universities would simply make 
> a 
> > > linear 24 or 32-bit address space or use some other better addressing 
> > > scheme to access 16+ quantities (e.g. by having wider addressing 
> > > registers) 
> >  
> > Universities won't be doing anything other than 
> > producing 8-bit CPUs. It will require industry 
> > to be formed to produce 16-bit CPUs. 
>  
> Yeah, because universities currently do nothing but make 8-bit CPUs, and 
> all industry only makes 16-bit+ CPUs (I wouldn't be surprised if it is 
> actually the other way around: no universities doing much with 8-bit now, 
> and industry still making them for washing machines and the like) 
 
I think you misunderstood what I said. 
 
If the nuclear powers deliberately take out the industrial 
cities (which they may well do - no-one knows), the only 
people CAPABLE of manufacturing new CPUs will be 
universities. 
 
But universities (today) don't have a reason (or the 
ability) to manufacture the latest and greatest CPUs. They 
have only rudimentary capability; they can manage 8-bit 
CPUs. 
 
So, under the right nuclear war circumstances, the 
universities will be the centre of attraction in the 
recovering computer industry, as they will be at the 
forefront of the field. 
 
It is unclear in which direction, and how fast, we would 
progress from a new world where only 8-bit CPUs can be made. 
 
It may or may not transition through a 16:16 segmentation 
phase. No-one knows for sure, although some people here 
seem to think they or someone they know has an infallible 
crystal ball. 
 
I don't subscribe to crystal ball theories and leave my 
options open. 
 
BFN. Paul.  | 
     
                
             tkchia 
        
  
  12.11.2022, 08:20                        
  @ kerravon
         | 
     nuclear war | 
    
    
     Hello kerravon, 
 
> Yeah, some people like to pretend there was no winner 
> of the Cold War. If the Soviets had actually managed 
> to enslave all of Europe, and you happened to be 
> living in Western Europe when they kicked down your 
> door, maybe you would understand the reality of what 
> it's like to lose the Cold War. 
 
I am pretty sure that people can still kick down doors with impunity in some parts of the world. 
 
Which raises the question, or rather, several questions: 
- So "we" supposedly "won" the Cold "War" — and this is, supposedly, a mightily good thing, because Karl Marx. 
- So what precisely is this "war" about again?  I thought it was about "they kick[ing] down your door", but this is still happening. 
- At what point did "we" decide, the "war" was "won", and "mission accomplished"? 
- And who exactly is this "we" anyway? 
 
Since you are apparently very fond of asking for standards and definitions, perhaps you can try to provide some. 
 
> > And nowhere do you explain 
> > - why this "standardization" is so important in a post-nuclear world 
> I'm not particularly claiming that it is "important". 
> I just want a standard to code to. POSIX doesn't cut it. 
 
Well, to put it simply: that is your problem, not the world's problem. 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"  | 
     
                
             kerravon 
        
  
  Ligao, Free World North,  12.11.2022, 08:43                        
  @ tkchia
         | 
     nuclear war | 
    
    
     > Hello kerravon, 
>  
> > Yeah, some people like to pretend there was no winner 
> > of the Cold War. If the Soviets had actually managed 
> > to enslave all of Europe, and you happened to be 
> > living in Western Europe when they kicked down your 
> > door, maybe you would understand the reality of what 
> > it's like to lose the Cold War. 
>  
> I am pretty sure that people can still kick down doors with impunity in 
> some parts of the world. 
 
Yes, but not in Western Europe, because we didn't 
lose the Cold War, no matter how much you may like 
to pretend that the war was vague, with no winners 
or losers, in order to convince people that freedom 
has no value, that we shouldn't try to win wars, and 
that we shouldn't be grateful to America (note that 
I'm not American). 
 
> Which raises the question, or rather, several questions: 
> - So "we" supposedly "won" the Cold "War" — and this is, supposedly, a 
> mightily good thing, because Karl Marx. 
 
Not supposedly. It really is. 
 
"we" is the free world. There really is such a thing 
as freedom, and it's not the communist definition 
which was "living under a communist dictator". 
 
> - So what precisely is this "war" about again?  I thought it was about 
> "they kick[ing] down your door", but this is still happening. 
 
Not in Western Europe because of communism, at any 
rate. If you are unlucky enough to be a North Korean, 
you can have your door kicked down, and get raped at 
one of Kim's parties if he chooses to do so too. 
 
> - At what point did "we" decide, the "war" was "won", and "mission 
> accomplished"? 
 
When there was no longer a major player peddling 
communism - in 1991. 
 
It's not completely won while there is even 
one single person saying that living under a 
communist dictatorship is no big deal. 
 
> - And who exactly is this "we" anyway? 
>  
> Since you are apparently very fond of asking for standards and definitions, 
> perhaps you can try to provide some. 
 
The proper definition of "freedom" is "living under 
a rational, humanist, non-subjugating government". 
 
Other people define it as "not being a British 
colony" (e.g. the US back when only white male 
land-owners were allowed to vote) or "not being a 
colony of anyone" (e.g. most African dictatorships), 
or "living under a communist dictator" (all 
communist countries). 
 
I suggest we standardize on my definition. 
 
> > > And nowhere do you explain 
> > > - why this "standardization" is so important in a post-nuclear world 
> > I'm not particularly claiming that it is "important". 
> > I just want a standard to code to. POSIX doesn't cut it. 
>  
> Well, to put it simply: that is your problem, not the world's 
> problem. 
 
I didn't claim it was the world's problem. 
 
I asked for assistance in standardizing an API 
suitable for small computers. 
 
If I am the only person who actually wants that, 
so be it. You have no way of proving that though, 
now or in the future. 
 
If you personally don't want to help, so be it too. 
 
BFN. Paul.  | 
     
                
             kerravon 
        
  
  Ligao, Free World North,  12.11.2022, 08:48                        
  @ Rugxulo
         | 
     nuclear war | 
    
    
     > > My 16-bit OS only really becomes practical with 
> > about 2 MB of memory, so I need a 16:16 machine 
> > with a 5-bit segment shift, or something similar 
> > to the 80286 will also work, and that is my 
> > interest and priority. 
>  
> There were 186 clones with 24-bit addressing. The 186 was still being made 
> at least until 2007. (I believe OpenWatcom contributor Wilton Helm had much 
> experience with embedded 186.) 
>  
> https://www.cpushack.com/2013/01/12/the-intel-80186-gets-turbocharged-vautomation-turbo186/ 
 
FANTASTIC!!! Real hardware that does exactly what I want. 
 
I will work this into my repertoire. I wasn't able 
to find the actual instruction data sheet for it, 
but I may have enough information regardless. And 
it's the concept that's important anyway. 
 
 
https://openwatcom.users.c-cpp.narkive.com/wBo3RarK/186-24-bit-addressing 
 
The only processor I know of that uses 186 24 bit addressing is the Dstni 
series, so I'm guessing that is what you are using. Yes, there is support 
for it. I implemented it a few years ago. It was broken in 1.8, but I have 
been told it has been fixed, although I haven't had a chance to test it (I 
use an older version of the linker and have been overloaded with production 
code). 
 
OP HSHIFT=8 sets it up. Note that you can also set the __HShift assembly 
variable to 8 which will make the huge memory model RTL code generate proper 
addressing. 
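 
To make the HSHIFT idea concrete, here is a rough 
sketch of my own (not the Open Watcom runtime code) 
of how a 16:16 segment:offset pair turns into a 
linear address under different segment shifts: 
 
#include <stdio.h> 
 
/* linear address for a segment:offset pair with a given shift */ 
static unsigned long linearAddress(unsigned int seg, unsigned int off, 
                                   unsigned int shift) 
{ 
    return ((unsigned long)seg << shift) + off; 
} 
 
int main(void) 
{ 
    /* classic 8086: 4-bit shift, about 1 MB addressable */ 
    printf("%lX\n", linearAddress(0xFFFFu, 0xFFFFu, 4)); 
 
    /* 5-bit shift: about 2 MB, the case I want */ 
    printf("%lX\n", linearAddress(0xFFFFu, 0xFFFFu, 5)); 
 
    /* Turbo186 / HSHIFT=8: 8-bit shift, about 16 MB */ 
    printf("%lX\n", linearAddress(0xFFFFu, 0xFFFFu, 8)); 
 
    return 0; 
} 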
 
 
> There was also the Bandai Wonderswan (NEC V30) circa 1999: 
>  
> https://en.wikipedia.org/wiki/WonderSwan 
 
As far as I can tell, the NEC V30 only does 20-bit 
addressing, so I don't know the relevance. 
 
> > Is there any reason why OS/2 2.0 didn't use that same API? 
> > And 64-bit Windows? Or rather - could it? 
>  
> Microsoft wanted to "control the standard", so to speak, but IBM fired 
> them. They don't want to license *nix from AT&T for Xenix, for instance. 
> They want to do their own thing. 
>  
> http://gunkies.org/wiki/Gordon_Letwin_OS/2_usenet_post  (circa 1995) 
>  
> > Or it could be done the other way around - take the 
> > Windows API and implement it for MSDOS, since 
> > Windows doesn't use fork(). 
>  
> There are lots of software patents and lawyers. While many agree that APIs 
> can't be copyrighted, it's still a minefield. Just because they "got away" 
> with it in the old days (e.g. PC-DOS vs. CP/M, Compaq vs. IBM BIOS) doesn't 
> mean they wouldn't still clamp down in a heartbeat if they could. 
>  
> https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_Inc. 
>  
> (I don't really want to mention that, but for completeness, it's worth 
> noting ... barely.) 
 
Well, if that's the issue, then maybe that's what 
I can offer - my API is explicitly public domain. 
 
If that's the only public domain API to choose 
from, then are you happy with it (for small 
systems) or do you want some changes? 
 
Thanks. Paul.  | 
     
                
             tkchia 
        
  
  12.11.2022, 08:56         (edited by tkchia, 12.11.2022, 09:16)                
  @ kerravon
         | 
     nuclear war | 
    
    
     Hello kerravon, 
 
> > I am pretty sure that people can still kick down doors with impunity in 
> > some parts of the world. 
> Yes, but not in Western Europe, because we didn't 
> lose the Cold War, no matter how much you may like 
> to pretend that the war was vague, with no winners 
> or losers, in order to convince people that freedom 
> has no value, that we shouldn't try to win wars, and 
> that we shouldn't be grateful to America (note that 
> I'm not American). 
 
OK: 
 
- So you are saying "we" is limited to Western Europe.  And perhaps parts of Oceania.  Never mind the impact, good or bad, that this "winning" of "ours" has on the rest of the world.  The important thing is that "we" "won", whatever that means.  Never mind anyone who is not "we". 
 
- Also, I am sure people still kick down doors with impunity even in America.  Actually I have heard that people can shoot people dead — without having to kick down doors — and do so with impunity.  In America.  Or maybe those cases do not count because we cannot blame them on Karl Marx? 
 
So again, what is this Cold "War" about, that "we" supposedly "won"? 
 
> > > I just want a standard to code to. POSIX doesn't cut it. 
> > Well, to put it simply: that is your problem, not the world's 
> > problem. 
> I didn't claim it was the world's problem. 
> I asked for assistance in standardizing an API 
> suitable for small computers. 
 
The very idea of creating a "standard" is to offer something to the world.  If your only motivation for proposing a standard is because "I" (i.e. you) want it, then you are doing it wrong. 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"  | 
     
                
             kerravon 
        
  
  Ligao, Free World North,  12.11.2022, 09:16                        
  @ tkchia
         | 
     nuclear war | 
    
    
     > Hello kerravon, 
>  
> > > I am pretty sure that people can still kick down doors with impunity 
> in 
> > > some parts of the world. 
> > Yes, but not in Western Europe, because we didn't 
> > lose the Cold War, no matter how much you may like 
> > to pretend that the war was vague, with no winners 
> > or losers, in order to convince people that freedom 
> > has no value, that we shouldn't try to win wars, and 
> > that we shouldn't be grateful to America (note that 
> > I'm not American). 
>  
> OK: 
>  
> - So you are saying "we" is limited to Western Europe. 
 
Nope, there's free people everywhere, like South Korea 
and Taiwan. 
 
> And perhaps parts 
> of Oceania.  Never mind the impact, good or bad, that this "winning" of 
> "ours" has on the rest of the world.  The important thing is that "we" 
> "won", whatever that means. 
 
It has a meaning, even if you like to pretend 
it doesn't. 
 
> Never mind anyone who is not "we". 
 
They are also helped by not having communist 
dictators harming humanity. 
 
> - Also, I am sure people still kick down doors with impunity even in 
> America.  Actually I have heard that people can shoot people dead — 
> without having to kick down doors — and do so with impunity.  In America. 
> Or maybe those cases do not count because we cannot blame them on Karl 
> Marx? 
 
Are you talking about people doing things illegally 
or legally? 
 
Of course crime exists everywhere in the world. When 
you have a dictator, it's the government doing the 
crime. You can't report their crimes to the police. 
 
If Uday Hussein abducted you off an Iraqi street and 
raped you, that's your bad luck. The police are on his 
side. You were raped by your own government instead of 
being protected by it. And men had their tongues cut 
out, with genuine impunity. I can show you video of 
Iraqi men having their tongues cut out if you'd like 
to continue to insist that there is no concept of 
freedom. 
 
If you have evidence of an American breaking American 
law, please report him or her to the American police 
and the free American media and let him or her face 
American justice. 
 
Note that courts don't always give you the ruling you 
hoped for, but it's the best we know how to actually 
do. 
 
Communist dictatorships are not the best we know how 
to do. 
 
> So again, what is this Cold "War" about, that "we" supposedly "won"? 
 
Freedom from communist state-slavery. 
 
> > > > I just want a standard to code to. POSIX doesn't cut it. 
> > > Well, to put it simply: that is your problem, not the world's 
> > > problem. 
> > I didn't claim it was the world's problem. 
> > I asked for assistance in standardizing an API 
> > suitable for small computers. 
>  
> The very idea of creating a "standard" is to offer something to the 
> world.  If your only motivation for proposing a standard is because "I" 
> (i.e. you) want it, then you are doing it wrong. 
 
You haven't established that the world is not 
being offered anything. You just stated it. 
 
I certainly want it. I don't know who wants it 
currently or who will want it in the future. 
 
Most people right now are probably not interested 
in standardizing the OS API for small systems. 
 
They may be in the future after a nuclear war when 
small systems become relevant again. 
 
And having a standard API will help, in my opinion, 
based on what happened in the past. 
 
I may have it wrong though - do you think small 
systems (in the past, and in the *possible* future) 
would or would not have benefitted from a 
standard API? 
 
BFN. Paul.  | 
     
                
             tkchia 
        
  
  12.11.2022, 09:39                        
  @ kerravon
         | 
     nuclear war | 
    
    
     Hello kerravon, 
 
> > And perhaps parts 
> > of Oceania.  Never mind the impact, good or bad, that this "winning" of 
> > "ours" has on the rest of the world.  The important thing is that "we" 
> > "won", whatever that means. 
> It has a meaning, even if you like to pretend 
> it doesn't. 
 
Well, OK: this "victory" of "ours" in the "Cold War" is all sunshine and roses, we just need to furiously turn a blind eye to all those parts that are not sunshine and roses.  Truly a glorious victory. 
 
(By the way, in my part of the world we just call it "the dissolution of the Soviet Union".) 
 
> > The very idea of creating a "standard" is to offer something to the 
> > world.  If your only motivation for proposing a standard is because 
> "I" 
> > (i.e. you) want it, then you are doing it wrong. 
> You haven't established that the world is not 
> being offered anything. You just stated it. 
 
Well, you are the one proposing a "standard", and there is this thing in the world called the "burden of proof".  To wit: the burden is on you to demonstrate that your proposed standard is actually useful to the world.  The onus is not on the rest of us to prove to you why we do not need your "standard". 
 
If I try to sell you stuff, is it your responsibility to "establish" to me why you do not need to buy my stuff?  Of course not; the very idea is absurd. 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"  | 
     
                
             kerravon 
        
  
  Ligao, Free World North,  12.11.2022, 09:59                        
  @ tkchia
         | 
     nuclear war | 
    
    
     > Well, OK: this "victory" of "ours" in the "Cold War" is all sunshine and 
> roses, we just need to furiously turn a blind eye to all those parts that 
> are not sunshine and roses.  Truly a glorious victory. 
 
We're not turning a blind eye. We're doing our 
best to free the rest of the world too. 
 
It's very difficult. There aren't a lot of tools 
available. 
 
> > > The very idea of creating a "standard" is to offer something to the 
> > > world.  If your only motivation for proposing a standard is 
> because 
> > "I" 
> > > (i.e. you) want it, then you are doing it wrong. 
> > You haven't established that the world is not 
> > being offered anything. You just stated it. 
>  
> Well, you are the one proposing a "standard", and there is this thing in 
> the world called the "burden of proof".  To wit: the burden is on you to 
> demonstrate that your proposed standard is actually useful to the 
> world.  The onus is not on the rest of us to prove to you why we do not 
> need your "standard". 
 
Wrong. I'm not claiming that a standard is or 
isn't useful to anyone besides me. I have no 
idea. I just know that at a minimum one person 
wants it. Absolute bare minimum. 
 
You are the one making the claim that no-one else 
in the entire world now or in the future will ever 
have a use for a standardized API for small 
computer systems. So the burden of proof is on you. 
 
Good luck proving a negative. You'll be the first 
person in history. 
 
> If I try to sell you stuff, is it your responsibility to "establish" to me 
> why you do not need to buy my stuff?  Of course not; the very idea is 
> absurd. 
 
Sorry, you're the one making wild claims, not me. 
 
Burden is on you. 
 
BFN. Paul.  | 
     
                
             tkchia 
        
  
  12.11.2022, 10:10                        
  @ kerravon
         | 
     nuclear war | 
    
    
     Hello kerravon, 
 
By the way: 
 
> (2 hot, 1 cold), won in our favor already. And 
> I count the "War on Terror" as World War 4 too. 
> Yet another ideological war (the same as 3). 
> But to actually beat "terror" requires a 
> comprehensive war covering a ridiculous number 
> of ideologies and even ideas, and at an 
> individual level, not just a leadership level. 
 
I find this "War on Terror" terminology even worse and more vague than the whole "Cold War" thing.  At least the Cold War had a definite end point — the dissolution of the Soviet Union. 
 
What does it even mean to "win" a "War on Terror"?  I can understand waging specific wars on specific groups, such as Al-Qaeda, or the Islamic State, or the Taliban.  But how does one stamp out all possible "terrorist" activity in the past, present, and even future?  At which point can one truly declare, "mission accomplished"? 
 
Nobody speaks of a "War on First-Degree Murder" or a "War on Drunk Driving Accidents", in the same vein as one speaks (or spoke) of a "War on Terror".  Why? 
 
Really... think about these things. 
 
Thank you! --- https://gitlab.com/tkchia · https://codeberg.org/tkchia · 😴 "MOV AX,0D500H+CMOS_REG_D+NMI"  |