DOS ain't dead

ho1459

Germany,
14.03.2008, 14:28
 

HX-DOS Extender & Virtual Pascal 2.1.279 (DOSX)

Hi everyone,

Since I had some trouble getting this to work properly and didn't find any other reference on it, I decided to share my findings here.

I had the problem that VP under DOS with HX started to compile but hung my machine after a few seconds. This was because VP needs a large number of file handles (FILES= in CONFIG.SYS), depending on the size of the project. I had to raise the handle count to 200. This is explained in the docs somewhere, but not explicitly for VP.
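For illustration, the relevant part of my CONFIG.SYS looks roughly like this (a minimal sketch; the HIMEM.SYS path is just where W98SE puts it, and the exact FILES= value depends on the size of your project):

    REM Give DOS enough file handles for VP's compiler
    REM (200 worked for me; bigger projects may need more)
    FILES=200
    BUFFERS=40
    REM XMS driver; path varies per installation
    DEVICE=C:\WINDOWS\HIMEM.SYS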

I would also like to share my VP/HX setup here; it is based on and tested with VP 2.1.279 and HX DPMILD32 3.2.0. The archive should be unpacked to ?:\VP\ and doesn't overwrite any existing files:
http://www.bnhof.de/~ho1459/files/vp_21_279_d32_hx.zip

Bye,
Stefan / AH

Japheth

Germany (South),
15.03.2008, 11:26

@ ho1459
 

HX-DOS Extender & Virtual Pascal 2.1.279

> I had the problem that VP under DOS with HX started to compile but hung
> my machine after a few seconds. This was because VP needs a large number
> of file handles (FILES= in CONFIG.SYS), depending on the size of the
> project. I had to raise the handle count to 200. This is explained in the
> docs somewhere, but not explicitly for VP.

Yes (HX\DOC\COMPAT.TXT).

Having to open up to 200 files concurrently sounds like a design flaw. Can't this be changed in the VP source?

---
MS-DOS forever!

Rugxulo

Usono,
16.03.2008, 19:15

@ Japheth
 

HX-DOS Extender & Virtual Pascal 2.1.279

> > This was because VP needs a large number
> > of file handles (FILES= in CONFIG.SYS), depending on the size of the
> > project. I had to raise the handle count to 200. This is explained in
> > the docs somewhere, but not explicitly for VP.
>
> Yes (HX\DOC\COMPAT.TXT).
>
> Having to open up to 200 files concurrently sounds like a design flaw.
> Can't this be changed in the VP source?

I'm curious to know which DOS was tested that needs that many. Maybe other DOSes are more forgiving in that regard (e.g. if fewer works for Laaca)?

BTW, VP isn't open source, so you're at the mercy of the original author (and I think it's more or less unsupported / discontinued).

ho1459

Germany,
17.03.2008, 21:07

@ Rugxulo
 

HX-DOS Extender & Virtual Pascal 2.1.279

Hi Rugxulo,

Nice to see you here too!

> I'm curious to know which DOS was tested that needs that many. Maybe other
> DOSes are more forgiving in that regard (e.g. if fewer works for Laaca)?

I am using MS-DOS 7.0 (W98SE), and I have tried a clean configuration w/o any memory driver, one with XMS only, and one with QEMM 8.0.
Only raising FILES= made it work, no matter the configuration.

VP opens a lot of files in a very short time, and it is a very fast compiler.
Maybe the authors sacrificed a large number of file handles in favor of a fast/dirty compilation scheme - I don't know.


All the best,
Stefan / AH

marcov

07.04.2008, 11:45
(edited by marcov, 07.04.2008, 15:35)

@ Rugxulo
 

HX-DOS Extender & Virtual Pascal 2.1.279

> BTW, VP isn't open source, so you're at the mercy of the original author
> (and I think it's more or less unsupported / discontinued).

I'm actually an FPC developer (and no, not for DOS :-), but I've been in contact with VP's owner (and maintainer in recent years) several times.

And yes, Allan pulled the plug, after several attempts to build a community to open-source it. These communities failed to adequately maintain the RTL (though Veit K. especially did a good job), let alone the compiler, linker and debugger.

And the core compiler is in pure assembler, 6 MB of it, and Allan himself was unable to do major restructuring/enhancements in it (a different programmer, Vitaly Miryanov, did the original compiler work in the mid-nineties). I've seen the sources, and they are totally unmaintainable: a sizeable body of code without much documentation (about internals, I mean, not end-user docs) or comments...
Keep in mind that the FPC compiler was also improving rapidly. Any team that can't get a new DOS/Windows/Linux release of FPC up to release quality would be totally lost in the VP source.

While the IDE is in Pascal and also generally interesting, it is too entangled with the assembler parts, and AFAIK also with copyrighted code (Borland TV; the Pascal code is not free, the C++ code is different).

The only remaining interest would be some of the more arcane OS/2-related parts in the debugger and linker, and only to other projects.

Allan and 2-5 users are still reachable via VPascal.ning.com, but the activity there is low.

These are also typical problems of open-sourcing large corporate codebases, especially if they are from the nineties or earlier: quite unmaintainable, and using commercial components that have to be replaced. Both require near-complete rewrites. This is also why the heavily sponsored Mozilla project took years to get to the first milestone that was somewhat comparable in quality to the Netscape 4.x browsers (Milestone 18, to be exact).

Rugxulo

Usono,
07.04.2008, 23:27

@ marcov
 

HX-DOS Extender & Virtual Pascal 2.1.279

> > BTW, VP isn't open source, so you're at the mercy of the original author
> > (and I think it's more or less unsupported / discontinued).

(BTW, I meant that it's impossible for us or anybody but Allan to fix the FILES= bug because of this.)

> I'm actually an FPC developer (and no, not for DOS :-), but I've been in
> contact with VP's owner (and maintainer in recent years) several times.

Yes, I recognize you from other forums (e.g. FreeBASIC).

> And yes, Allan pulled the plug, after several attempts to build a
> community to open-source it.

I am not a Pascal user, and I don't use VP, so take this with a grain of salt:

Why would you "pull the plug" just because "not enough" people use it? Isn't that a surefire way to minimize the number of users? Besides, isn't a handful more than zero? (I don't understand the details, obviously.)

> These communities failed to adequately
> maintain the RTL (though Veit K. especially did a good job), let alone
> the compiler, linker and debugger.

You have to give people time. And yes, some projects have more interest than others (e.g. Linux). Sometimes it takes a while before people start messing with things with their little hacks and improvements.

> And the core compiler is in pure assembler, 6 MB of it, and Allan himself
> was unable to do major restructuring/enhancements in it (a different
> programmer, Vitaly Miryanov, did the original compiler work in the
> mid-nineties). I've seen the sources, and they are totally unmaintainable.

What has to be done? (Not what would be nice, but what is a major bug, major flaw, etc.??)

> Keep in mind that the FPC compiler was also improving rapidly. Any team
> that can't get a new DOS/Windows/Linux release of FPC up to release
> quality would be totally lost in the VP source.

I don't personally understand why FPC has semi-dropped DOS support. Doesn't anybody use DOS anymore? Is it really that hated nowadays? Are other platforms (GBA, DS, WinCE) really?? more important? Did DOS coders forget everything they knew? (Doubtful.) So I don't get it. Of course, I'm not trying to be impatient (developers often have several things they are working on), just slightly in shock that working code falls to pieces.

> While the IDE is in Pascal and also generally interesting, it is too
> entangled with the assembler parts, and AFAIK also with copyrighted code
> (Borland TV; the Pascal code is not free, the C++ code is different).

Yes, obviously, that has to be omitted in any public release.

> The only remaining interest would be some of the more arcane OS/2-related
> parts in the debugger and linker, and only to other projects.

Well, nobody has written a FreeOS/2 yet (although such projects were barely even started). Even FreeDOS still has plenty of room for improvement (and it's a big success, IMO). But yeah, how could it hurt to release only parts? At worst, it helps anybody wishing to learn a little assembly (e.g. size or speed tricks); at best, it helps them implement something else useful (FreeOS/2??).

> Allan and 2-5 users are still reachable via VPascal.ning.com, but the
> activity there is low.

Well (no offense, not directed at you/yours), but even a discontinued "hard-to-maintain" working compiler beats a compiler that used to work but doesn't anymore. At least it works and is free, so I can't complain. (NDN rocks!)

> These are also typical problems of open-sourcing large corporate
> codebases, especially if they are from the nineties or earlier: quite
> unmaintainable, and using commercial components that have to be replaced.
> Both require near-complete rewrites. This is also why the heavily
> sponsored Mozilla project took years to get to the first milestone that
> was somewhat comparable in quality to the Netscape 4.x browsers
> (Milestone 18, to be exact).

Yes, of course! Plenty of projects (ahem, FreeDOS) had a lot of work to do to get up to snuff "back in the day". It's not easy, but you have to start somewhere (if ever). If it's worth it, even if hard, then it should be done. But starting is half the process and finishing is the other (difficult) half. "It's always difficult before it becomes easy."

P.S. If you're somehow implying that assembly is harder to maintain, that's not true. And plenty of people know (some variant of) Intel assembly. However, I dunno what assembler was used or how "high level" (e.g. MASM) VP needs. So whatever. (Disclaimer: I know some assembly, but I'm far far far from being a guru.)

---
Know your limits.h

marcov

08.04.2008, 13:28

@ Rugxulo
 

HX-DOS Extender & Virtual Pascal 2.1.279

> (BTW, I meant that it's impossible for us or anybody but Allan to fix the
> FILES= bug because of this.)

Assuming that is an RTL problem, there might be more. Best would be to start detailing the problem and the fix, and present it as a complete case study on the mentioned site. But the important question is: what is the use?

> Why would you "pull the plug" just because "not enough" people use it?

He didn't do that. He pulled the plug because not enough people were contributing, and he had no time anymore. Two attempts to set up a community failed. People were all talk.

> Isn't that a surefire way to minimize the number of users?

Maybe, but that is not the point.

> Besides, isn't a handful more than zero? (I don't understand the details, obviously.)

Two possible answers:

1)
This is not about the users at all, but about the developers. There are no more developers, two _year-long_ attempts to find new ones failed, and copyright problems hinder open-sourcing it as-is (which would give a faint, but non-zero, chance of somebody picking it up).

2)
And even if the number of users is larger than zero, what is that supposed to signify? That as long as you have at least one person claiming to be a user, you are obliged to keep supporting the thing indefinitely?

> You have to give people time.

Two years and longer. Before that, VP had been in maintenance mode pretty much since 2000.

> What has to be done? (Not what would be nice, but what is a major bug,
> major flaw, etc.??)

During those repeated attempts, most remaining users were interested in new development: dynamic array support, overloading and Int64 support (iow, features beyond basic D2 support), and ports to new platforms. However, in reality new development would have to replace the copyrighted parts first (including Turbo Vision, and rewriting the IDE to accommodate the changes).

> I don't personally understand why FPC has semi-dropped DOS support.
> Doesn't anybody use DOS anymore?

Very few. But worse, even fewer _invest_ in it. It wasn't FPC or VP that dropped DOS, but the DOS users, by not participating in significant enough numbers.

> Is it really that hated nowadays? Are
> other platforms (GBA, DS, WinCE) really?? more important?

All platforms are maintained by enough people working hard enough on them to keep them in a releasable state, to build releases, and in general to maintain the knowledge about them. For some, like Linux and XP, that is already done by the core developers; others (like all those you name, but e.g. also OS X, which Jonas maintains, or FreeBSD, which I do) are kept on their feet by their respective maintainers.

No maintainers, and nobody new for a while? -> no releases can be built.

Somehow this loss of maintainers is more frequent for the MS platforms than for others. I personally think this is not so much MS itself, but more that the users on those platforms are more used to being "customers", iow they don't take part in development.

> Did DOS
> coders forget everything they knew? (Doubtful.) So I don't get it. Of
> course, I'm not trying to be impatient (developers often have several
> things they are working on), just slightly in shock that working code
> falls to pieces.

Working code must be kept working. Otherwise you get problems like with the 1.9.x series of DOS releases (when we still released for DOS if we could): a string of broken releases. All breakages caused by new development were fixed, but nobody debugged them, resulting in poorly packaged releases of poor quality.

Maybe this string now comes to an end with, finally, a decent 2.2.2 release, because I saw that Laaca (also here) submitted some patches. But that is FPC-specific; for now, the rest of the project is still very much alive, and has a die-hard core to do the non-platform-specific part.

This has all happened before, e.g. with BeOS, which had been dead for several years before Olivier started working on it, and also with AmigaOS.

And VP faced a huge "cleanup" stage first before an open source version could be released. The owner/maintainer tried to get this started a few times, and then decided he had invested enough time in it.

And I don't blame him for that.

(copyright)
> Yes, obviously, that has to be omitted in any public release.

Which breaks the IDE and debugger, the things the users valued most (e.g. compared to FPC, those are the strong points of VP).

> Well (no offense, not directed at you/yours), but even a discontinued
> "hard-to-maintain" working compiler beats a compiler that used to work but
> doesn't anymore.

You can still download the archive (see the above link), so there is no problem there. There is simply no development anymore. For that, a significant investment must be made, and because the code in its current state can never be free, it would take years before the first usable version reached users.

Rugxulo

Usono,
08.04.2008, 15:58

@ marcov
 

HX-DOS Extender & Virtual Pascal 2.1.279

> > I don't personally understand why FPC has semi-dropped DOS support.
> > Doesn't anybody use DOS anymore?
>
> Very few. But worse, even fewer _invest_ in it. It wasn't FPC or VP that
> dropped DOS, but the DOS users, by not participating in significant enough
> numbers.

A lot of people just aren't aware that FreeDOS is active and could use testers. They (especially Eric) have done a lot. It's not as bad a platform as it used to be (although there's always room for improvement).

And seriously, I find it hard to believe that, with all the talented Linux coders out there, no one has the skills to port more stuff to DOS.

> Somehow this loss of maintainers is more frequent for the MS platforms
> than for others. I personally think this is not so much MS itself, but
> more that the users on those platforms are more used to being "customers",
> iow they don't take part in development.

Well, it's true, a lot of people don't have the skills (e.g. mine are fairly limited). And of course you (usually) develop for what OS you use the most. But some groups only seem to target the "latest and greatest" (i.e. only "modern" techniques or x86-64 hardware, etc), which boggles the mind. "Oh, well, C99 exists, and even though we don't need it, let's require it anyways." (Meh.)

> And VP faced a huge "cleanup" stage first before an open source version
> could be released. The owner/maintainer tried to get this started a few
> times, and then decided he had invested enough time in it.
>
> And I don't blame him for that.

He can do what he wants with his free time. I don't blame him at all. But it seems silly to complain (not that anyone really is, AFAICT) about lack of developers when there isn't anything publicly available to develop.

> (copyright)
> > Yes, obviously, that has to be omitted in any public release.
>
> Which breaks the IDE and debugger, the things the users valued most
> (e.g. compared to FPC, those are the strong points of VP).

The IDE isn't so important; the debugger is, but it can be lived without (some people never use 'em!).

> > Well (no offense, not directed at you/yours), but even a discontinued
> > "hard-to-maintain" working compiler beats a compiler that used to work
> > but doesn't anymore.
>
> You can still download the archive (see the above link), so there is no
> problem there. There is simply no development anymore. For that, a
> significant investment must be made, and because the code in its current
> state can never be free, it would take years before the first usable
> version reached users.

I personally think "enhancements" can be ignored. If you want > Delphi 2 functionality, use Delphi (or FPC or GPC or whatever). So, I wouldn't blame him if he didn't bother adding new features. But even releasing the runtime library sources could greatly benefit somebody in the future (although I dunno who specifically).

marcov

10.04.2008, 10:02

@ Rugxulo
 

HX-DOS Extender & Virtual Pascal 2.1.279

> > Very few. But worse, even fewer _invest_ in it. It wasn't FPC or VP that
> > dropped DOS, but the DOS users, by not participating in significant
> > enough numbers.
>
> A lot of people just aren't aware that FreeDOS is active and could use
> testers. They (especially Eric) have done a lot. It's not as bad a
> platform as it used to be (although there's always room for improvement).

If it is so active, then why didn't they step up? Either they are not interested in these packages (except for Laaca), or they have no time.

Either way you remain in the situation that the tool developers have no direct need for DOS, and that the DOS users don't have a direct need for the tool. How else do you explain the relative inactivity of the DOS ports of FPC and VP? (GPC is still maintained for DJGPP, I believe by M. Lombardi, but that's the same guy that did it 10 years ago too, without much help.)

Apparently, DOS still has enough compilers, or somebody would step up.

> And seriously, I find it hard to believe that, with all the talented Linux
> coders out there, no one has the skills to port more stuff to DOS.

Sure. Microsoft probably also has enough people who _could_ maintain DOS, but they don't either.

The trick is finding people with interest _and_ skills for these. And they usually have to come from the user circles, since DOS has disappeared from everyday life, and user circles like this one are the only source of DOS users.

> > And I don't blame him for that.
>
> He can do what he wants with his free time. I don't blame him at all. But
> it seems silly to complain (not that anyone really is, AFAICT) about lack
> of developers when there isn't anything publicly available to develop.

This is unfair to Allan. Allan spent two years of very busy time (he was moving across countries) to try to build some community.

> > Which breaks the IDE and debugger, the things the users valued most
> > (e.g. compared to FPC, those are the strong points of VP).
>
> The IDE isn't so important; the debugger is, but it can be lived without
> (some people never use 'em!).

All the remaining VP users considered them of the utmost importance. It was the main reason to use VP over FPC.

> So, I wouldn't
> blame him if he didn't bother adding new features. But even releasing the
> runtime library sources could greatly benefit somebody in the future
> (although I dunno who specifically).

The runtime library was also not copyright-free. One of the larger chunks (besides TV) was sysutils. A project to port sysutils from FPC failed, due to lacking skills and a lot of noise on the mailing list about what should be done. I tried to help several times, but got flamed for being too FPC-centric. Which was true, but it was the only sane way: any major changes would upset the stability that was VP's only asset.

But the users kept arguing, and the discussion got crazier and crazier, planning the most difficult things (including rewriting the compiler), while at the same time not being able to port (not develop from scratch, just grab from FPC!) even a relatively simple piece of source like sysutils. This was the third time this had happened to an attempt to revive it, so Allan gave up.

Rugxulo

Usono,
11.04.2008, 07:14

@ marcov
 

FPC for DOS / FreeDOS

> > A lot of people just aren't aware that FreeDOS is active
>
> If it is so active, then why didn't they step up? Either they are not
> interested in these packages (except for Laaca), or they have no time.

In case you haven't noticed, FreeDOS is mostly written in C (with a little ASM here and there). The official compiler is OpenWatcom, and the official assembler is NASM. It is (so far) more focused on mirroring MS-DOS compatibility ("BASE") than on adding new features (although that happens a lot too). Plus, GPL is strongly preferred. As it is, though, not much of theirs uses Pascal (although they include NDN in "UTIL" and Blocek in "EDIT"). So their focus is mirroring what people are already familiar with, and, lacking any reason to focus on Free Pascal (since MS didn't have one), they haven't done so. Besides, they mostly do it all in their spare time.

The core people are more interested in refining things: updating webpages, online docs, FAQs, the wiki, etc., improving the overall quality and packaging, HTML documentation, .ISOs, backing up files on iBiblio, and so on. (And they migrated from CVS to SVN, and from Bugzilla 2 to Bugzilla 3.) It's also very hard to get things done (i.e. perfectly finished) when people move, are too busy, or have other priorities. And yet they still get things done (just slower than we'd all like, I guess). It takes a lot more effort to polish, bug-test, and document something than to just throw it together in one big hackfest.

> Either way you remain in the situation that the tool developers have no
> direct need for DOS, and that the DOS users don't have a direct need for
> the tool. How else do you explain the relative inactivity of the DOS ports
> of FPC and VP? (GPC is still maintained for DJGPP, I believe by M.
> Lombardi, but that's the same guy that did it 10 years ago too, without
> much help.)
>
> Apparently, DOS still has enough compilers, or somebody would step up.

Stefan Weber uses VP.
Laaca uses FP.
Jason Burgon uses BP7.
Jason Sinclair uses TP7.

Others (MegaBrutal, Eric Auer) seem to occasionally use TP55 (free version).

Maybe you should tell us exactly why FPC is better than all the others? (Obviously: better license, more portable.) Or at least give us some idea of what HAS to be fixed. It must just need more publicity, then.

> > > And I don't blame him for that.
> >
> > He can do what he wants with his free time. I don't blame him at all.
> > But it seems silly to complain (not that anyone really is, AFAICT) about
> > lack of developers when there isn't anything publicly available to
> > develop.
>
> This is unfair to Allan. Allan spent two years of very busy time (he was
> moving across countries) to try to build some community.

So VP is dead and FP for DOS is comatose? Not good. :-(

I'm not blaming him, I don't know all the details (obviously). It just seems like he'd release whatever he legally could and "let the users worry about it". That's what most developers these days do. But whatever, I don't care either way.

> > The IDE isn't so important; the debugger is, but it can be lived
> > without (some people never use 'em!).
>
> All the remaining VP users considered them of the utmost importance. It
> was the main reason to use VP over FPC.

VP isn't really DOS-based, and yet you're talking to me (a DOS user), so obviously I value functionality over style and graphics. I'm more in favor of a working cmdline compiler being open-sourced than none at all. All the other cruft can come later. For sure, you can live without an IDE.

> > So, I wouldn't blame him if he didn't bother adding new features. But
> > even releasing the runtime library sources could greatly benefit
> > somebody in the future
>
> The runtime library was also not copyright-free. One of the larger chunks
> (besides TV) was sysutils. A project to port sysutils from FPC failed, due
> to lacking skills and a lot of noise on the mailing list about what should
> be done.

Well, obviously FPC trumps VP in a lot of ways (more ports, better compatibility, open source). I can understand your extreme pessimism re: VP in the future (and I have no huge hopes for it), but FPC for DOS hopefully won't collapse like that. I mean, it's GPL, how can it?!

> But the users kept arguing, and the discussion got crazier and crazier,
> planning the most difficult things (including rewriting the compiler),
> while at the same time not being able to port (not develop from scratch,
> just grab from FPC!) even a relatively simple piece of source like
> sysutils. This was the third time this had happened to an attempt to
> revive it, so Allan gave up.

Has he tried FPC? Does he like it? Maybe he can focus his energy on making that better instead of worrying about other stuff. (Although I still say any asm code he has might potentially be useful, but whatever ....)

marcov

11.04.2008, 13:08

@ Rugxulo
 

FPC for DOS / FreeDOS

(snip freedos development model summary)

Yes, I know all that. But I was not talking about the core FreeDOS programmers, but rather the community at large.

Btw, there was a Microsoft Pascal, but the last version is from the eighties. (Windows 1.0 is rumoured to have been in Pascal, though it is hard to separate real facts from assumptions that result from stdcall being a Pascal-like calling convention.)

(still freedos:)
> The core people are more interested in refining things: updating webpages,
> online docs, FAQs, the wiki, etc.,

And that is the same as with FPC. The core focuses on the core of the project. The outer reaches, like finishing touches, additional libraries, etc., are done by the people actually interested in them.

> Stefan Weber uses VP. Laaca uses FP. Jason Burgon uses BP7. Jason Sinclair uses TP7.

> Others (MegaBrutal, Eric Auer) seem to occasionally use TP55 (free
> version).
>
> Maybe you should tell us exactly why FPC is better than all the others?

I can give you the opposite. The main problems with FPC:
- In all versions, a major weak point is that debugging is at a lower level and requires some retraining. Actually, GDB and tools like Valgrind are more powerful than e.g. the TP debugger, but the TP debugger is more reliable and easier to learn/use.
- The later 1.0.x versions were pretty decent, but after 2000-2001 nobody invested much time in the DOS port. Some minor things were fixed, and the basic system is workable again. However, it simply needs some attention and maintenance to become a finished product again. And that is a pity, because the 2.x series is so much better in every way (rivaling Delphi rather than Turbo Pascal in speed and language).

IMHO in all other respects FPC is better. The most crucial one being that FPC (and maybe GPC) are the only ones with any future at all.

> (Obviously: better license, more portable.) Or at least give us some idea
> of what HAS to be fixed. It must just need more publicity, then.

It needs a few people who simply use it, find bugs and describe them, one of whom is able to fix most bugs himself, especially if they are DOS-specific (like DPMI exception handling, etc., or being able to get a new version of GDB to work).

Note that DOS isn't the only platform with this problem. All the older and fringe platforms (OS/2, BeOS, AmigaOS) suffer from it.

If you really want the IDE back up to snuff, add some additional people capable of handling the textmode IDE. For this item, the same problem as with DOS applies: the core only has time for minimal maintenance. Maybe not needed for you, but it will be if you want to get some users/testers interested.

> So VP is dead and FP for DOS is comatose? Not good. :-(

And GPC is also comatose, and maybe not just for DOS. However, it might get a breath of fresh air again; there is nothing fundamentally wrong there (like there is with VP). It could simply be something like the two maintainers being temporarily busy.

> I'm not blaming him, I don't know all the details (obviously). It just
> seems like he'd release whatever he legally could and "let the users worry
> about it". That's what most developers these days do. But whatever, I
> don't care either way.

That's what a lot of people said. But drawing up that balance was a monumental task in itself, and the chances of getting anything done were remote. He expected it would only lead to false hope, and directed the users to FPC so as not to fragment the remainder even more. Have a look at:

http://web.archive.org/web/20060312071305/www.vpascal.com/comment.php?comment.news.16

> For sure, you can live without an IDE.

Then you are the only one in VP circles. And I can live without an IDE, but that's mainly because when you debug the system itself, additional layers only add to confusion. For my day job I use IDEs (Delphi to be specific, and a bit of Lazarus on the side), just to keep some productivity.

> Has he tried FPC? Does he like it? Maybe he can focus his energy on
> making that better instead of worrying about other stuff.

He quit the whole business, since no longer having the time for a long consecutive stretch was one of the reasons to stop in the first place. He directed the other users to FPC, something that btw none of the users who tried to "fix" VP did. Which is something I regret, especially in Veit K.'s case, since he was already a long way towards being a valuable core member.

> (Although I still say any asm code he has might potentially be useful, but whatever

The trouble is that he would have to do the hard work for your "potential" use. And that potential was considered negligible by all who have seen the source.

Rugxulo

Usono,
11.04.2008, 15:50

@ marcov
 

FPC for DOS / FreeDOS

> (snip freedos development model summary)
>
> Yes, I know all that. But I was not talking about the core FreeDOS
> programmer, but rather the community at large.

It is a very, very fractured community; they are not organized. They are split into a million pieces, and they don't all stay up-to-date. (It's a big, big world.)

> Btw, there was a Microsoft Pascal, but the last version is from the
> eighties. (Windows 1.0 is rumoured to have been in Pascal, though it is
> hard to separate real facts from assumptions that result from of stdcall
> being a pascal like calling convention)

I know, but it wasn't / isn't free and didn't come with DOS by default (although QBASIC did). And FreeBASIC has tons and tons of old QB users.

> - The later 1.0.x versions were pretty decent,

Better than 2.0.4? More stable?? (Laaca, any comments?) That would be hard to believe, but I have heard rumors. So what has been the most stable DOS version, anyways, 1.0.10?? (Ignoring IDE and debugging but re: actual compiling.)

> IMHO in all other respects FPC is better. The most crucial one being that
> FPC (and maybe GPC) are the only ones with any future at all.

Yes, obviously, and that is due to being GPL. (But of course C/C++ is king, for good or bad.)

> > (Although I still say any asm code he has might potentially be useful,
> > but whatever
>
> The trouble is that he would have to do the hard work for your
> "potential" use. And that potential was considered negligible by all who
> have seen the source.

FASM is fully written in ASM, and it has many users. One guy even got it targeting ARM! And it (now) runs on about 9 (or 10?) OSes natively. So I would consider that a success. And the forum is quite active. It doesn't mean that its author gets tons of patches from users, but they do use it. So assembly is not dead, and certainly people still use it. It's not as impossible to understand as some people pretend (at least, not at the bare instruction level ... other stuff may be a bit more complex).

But I can definitely understand that some things take too much time, are too complex, etc. Plus, motivation is a rare thing, and it's hard to make yourself do things sometimes. :-(

Japheth

Germany (South),
11.04.2008, 17:22

@ Rugxulo
 

FPC for DOS / FreeDOS

> And the <FASM> forum is quite active.

I can confirm this. It's active and alive like a KINDERGARTEN.

---
MS-DOS forever!

marcov

11.04.2008, 19:50

@ Rugxulo
 

FPC for DOS / FreeDOS

> > - The later 1.0.x versions were pretty decent,
>
> Better than 2.0.4? More stable??

Depends on what your focus is. DOS support and IDE support _might_ be better in 1.0.x. The RTL and compiler are way more bug-free in 2.0.x, which was the main reason the 2.0.x branch was created in the first place.

The 1.0.x compiler had a little internal flaw that made maintenance painful.

> (Laaca, any comments?) That would be hard
> to believe, but I have heard rumors. So what has been the most stable DOS
> version, anyways, 1.0.10?? (Ignoring IDE and debugging but re: actual
> compiling.)

Then the more recent, the better, compiler-wise. DOS support, I don't know; Laaca probably knows that better than me.

> > IMHO in all other respects FPC is better. The most crucial one being
> > that FPC (and maybe GPC) are the only ones with any future at all.
>
> Yes, obviously, and that is due to being GPL.

Not really. Being open, and having some community actually developing it. License is less important.

> (But of course C/C++ is
> king, for good or bad.)

Actually VB and VB.NET are afaik still the most used development tools.

> FASM is fully written in ASM, and
> it has many users.

Well, maybe some others like to move dunes in the Sahara using a spoon, but I don't find that very interesting, let alone something I want to do myself.

Note also that an assembler is fairly simple compared to a full production level compiler.

> people still use it. It's not as impossible to understand as some people
> pretend (at least, not at the bare instruction level ... other stuff may
> be a bit more complex).

It might not be impossible, either, to move dunes in the Sahara using spoons. However, it is neither useful nor productive :-)

Worse: moving dunes is at least simple, just a lot of work. The danger with asm is that when it really grows, you'll never get it even somewhat bug-free.

Where the border lies is a matter of discipline and skill. But that comes at a productivity price. Moreover, assembler is no good to me if, for stability reasons, I still have to code according to a strict regime.

Rugxulo

Usono,
12.04.2008, 00:43

@ marcov
 

FPC for DOS / FreeDOS

> > > IMHO in all other respects FPC is better. The most crucial one being
> > > that FPC (and maybe GPC) are the only ones with any future at all.
> >
> > Yes, obviously, and that is due to being GPL.
>
> Not really. Being open, and having some community actually developing it.
> License is less important.

Okay, true, Win32 support would be nil if what I said was absolute. But then again, Win32 is everywhere, so getting support for that isn't surprising. (And FreeDOS is GPL and doesn't have as much FPC support, doh. So I guess I spoke too rashly.)

> > (But of course C/C++ is
> > king, for good or bad.)
>
> Actually VB and VB.NET are afaik still the most used development tools.

Not from my perspective, and certainly not in Linux or *BSD lands.

> > FASM is fully written in ASM, and
> > it has many users.
>
> Well, maybe some others like to move dunes in the Sahara using a spoon,
> but I don't find that very interesting, let alone something I want to do
> myself.

I think you overestimate the work involved. You can use external libraries just as in C. You don't have to reinvent the wheel for every project.

> Note also that an assembler is fairly simple compared to a full production
> level compiler.

An assembler could be simpler than a compiler, but it isn't always. I would certainly not consider NASM or FASM "simple." They are very powerful. Sure, a raw assembler that didn't do any macros, preprocessor tricks, etc., only raw instruction/opcode conversion would be simpler than trying to be POSIX and ANSI C compliant.

(Gah, stupid Windows: "Updated blah, do you want to reboot ..." [counts down from 5 min. if I don't explicitly say NO!])

> It might not be impossible, either, to move dunes in the Sahara using
> spoons. However, it is neither useful nor productive :-)

It is indeed useful for speed and size reasons. C is not as fast as assembly and definitely not optimal. (Although C compilers ain't that bad anymore.) You do indeed have to tweak a lot if you want speed (which adds up).

> Worse: moving dunes is at least simple, just a lot of work. The danger
> with asm is that when it really grows, you'll never get it even somewhat
> bug-free.

And anything ever is? No, but assembly is no worse. Heck, if you really want, just mix the two and have the "best of both worlds."

> Where the border lies is a matter of discipline and skill. But that comes
> at a productivity price. Moreover, assembler is no good to me if, for
> stability reasons, I still have to code according to a strict regime.

It has its uses. And some people prefer it. It all depends what you want and how much you know.

marcov

12.04.2008, 14:38

@ Rugxulo
 

FPC for DOS / FreeDOS

> > Actually VB and VB.NET are afaik still the most used development tools.
> Not from my perspective, and certainly not in Linux or *BSD lands.

True. But *nix is overly biased towards C (and its successor C++), due to the two evolving together. It is not a logical measuring rod for languages.

> I think you overestimate the work involved. You can use external libraries
> just as in C. You don't have to reinvent the wheel for every project.

It's more managing the size of the code that I'm worried about. That often is connected with the compiler checking the code.

One can of course do something to manage that (as is done in high-tech environments where assembler is still used, e.g. to develop firmware), but those methods are awfully laborious and costly. And at least they have a reason: if you ship 20,000,000 products, saving two cents on a microchip with less flash is useful. But they don't run FreeDOS :-)

> > Note also that an assembler is fairly simple compared to a full
> > production level compiler.
>
> An assembler could be simpler than a compiler, but it isn't always.
> I would certainly not consider NASM or FASM "simple." They are very
> powerful.

Because they provide a minor macro system? So do most compilers, and most mature ones have a complete inline assembler built in. Often with lots of extensions to interface with the HLL (e.g. structure access)

> Sure, a raw assembler that didn't do any macros, preprocessor
> tricks, etc., only raw instruction/opcode conversion would be simpler than
> trying to be POSIX and ANSI C compliant.

I'm talking about compilers here. POSIX is a minimal standard for an operating system, and ANSI C is a minimal standard for a compiler. They, however, say nothing about what an average compiler has.

> (Gah, stupid Windows: "Updated blah, do you want to reboot ..." [counts
> down from 5 min. if I don't explicitly say NO!])

Worse, if you postpone it, it will come back. At least on XP. In Vista you can luckily shut it up.

> It is indeed useful for speed and size reasons. C is not as fast as
> assembly and definitely not optimal.

That is a very crude remark, which is btw false from most perspectives.

The average piece of x86 assembler is systematically slower than the corresponding C code, simply because it makes assumptions to keep it manageable, or was originally written for an older version of the architecture (how much cmov do you use? And do you always test for, and then use, SSE2 to move memory?).

Only a very small core of very highly optimized and frequently updated assembler is actually faster.
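To illustrate the cmov point (a sketch; what a compiler actually emits depends on its version and flags, and the function name is just for the example): any current compiler will typically turn a conditional like this into a branchless cmov on its own, something hand-written 486-era assembler by definition never uses:

    /* gcc -O2 and friends usually compile this to a cmov instead of a
       conditional jump; pre-P6 hand-written asm predates cmov entirely. */
    int max_int(int a, int b)
    {
        return (a > b) ? a : b;
    }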

> (Although C compilers ain't
> that bad anymore.) You do indeed have to tweak a lot if you want
> speed (which adds up).

Well, the tweaking being optional is the point of a HLL in the first place.

> And anything ever is? No, but assembly is no worse. Heck, if you really
> want, just mix the two and have the "best of both worlds."

Nearly all runtime libraries have some assembler code somewhere for that reason, for primitives that greatly influence speed (like memcpy, strscan, etc.). But that is something else than developing APPS in it.
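For instance, something like the sketch below (my_memcpy is a made-up name for illustration; real RTLs export the standard memcpy): the C interface stays portable, while the body is what gets swapped for hand-tuned assembler on each architecture:

    #include <stddef.h>

    /* Naive portable fallback. A real RTL replaces this body with tuned
       assembler (rep movs, SSE2, ...), often selected at runtime. */
    void *my_memcpy(void *dst, const void *src, size_t n)
    {
        unsigned char *d = dst;
        const unsigned char *s = src;
        while (n--)
            *d++ = *s++;
        return dst;
    }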

> It has its uses. And some people prefer it. It all depends what you want
> and how much you know.

See my other mail. People can do what they want to. They can even wave away productivity comments with an "I like the puzzle side of assembler". It is the totally crooked justifications that madden me.

Rugxulo

Usono,
12.04.2008, 20:52

@ marcov
 

FPC for DOS / FreeDOS

> True. But *nix is overly biased towards C (and its successor C++), due to
> the two evolving together. It is not a logical measuring rod for
> languages.

HLLs are not evil. They can be very useful. I like the fact that I can easily recompile Bzip2 for my Pentium 4 or even my old Pentium 1. But no, even that doesn't help as much as I'd like. I still think it's too slow (well, more noticeable on my old 486). Even on modern machines, shaving time can be a blessing. (You don't compile -O2 when debugging do you? It takes much much longer. And don't forget "make": if speed wasn't important, we'd recompile the entire project from scratch instead of only that which was modified.)

> > I think you overestimate the work involved. You can use external
> > libraries just as in C. You don't have to reinvent the wheel for every
> > project.
>
> It's more managing the size of the code that I'm worried about. That often
> is connected with the compiler checking the code.

I wonder if you're drawing from your experience with VP here (complex, unmaintainable assembly).

> Because they provide a minor macro system? So do most compilers, and most
> mature ones have a complete inline assembler built in. Often with lots of
> extensions to interface with the HLL (e.g. structure access)

Neither GCC nor OpenWatcom support inline assembly intrinsically (they need an external program). Of course, I think Digital Mars and Intel do. And Intel can vectorize very well (supposedly), but it ain't free except for Linux. But still, somebody did the heavy lifting for you, which you can't always rely upon. Sometimes you have to do it yourself.

> > (Gah, stupid Windows: "Updated blah, do you want to reboot ..." [counts
> > down from 5 min. if I don't explicitly say NO!])
>
> Worse, if you postpone it, it will come back. At least on XP. In Vista you
> can luckily shut it up.

I don't know who thought that was a good idea. "Security vulnerability that might cause you to accidentally reboot and lose data ... okay we fixed it, now we're gonna force you to reboot." (!!!)

> > It is indeed useful for speed and size reasons. C is not as fast as
> > assembly and definitely not optimal.
>
> That is a very crude remark, which is btw false from most perspectives.

If you are doing anything time-intensive (compiling, compressing, finding files), you will notice how slow things are. And a typical C compiler is just not always fast enough. If you don't mind or notice, that's your virtue, not mine. I don't have the patience to wait a thousand hours just to recompress something.

> The average piece of x86 assembler is systematically slower than the
> corresponding C code, simply because it makes assumptions to keep it
> manageable, or was originally written for an older version of the
> architecture (how much cmov do you use? And do you always test for, and
> then use, SSE2 to move memory?).
>
> Only a very small core of very highly optimized and frequently updated
> assembler is actually faster.

The PAQ8 series indeed sees a big increase in speed with the MMX or SSE2 assembly routines. And in case you haven't noticed my previous posts, 7ZA + HX is much faster than p7zip (DJGPP). Of course, the latter is also due to MSVC being slightly better than GCC at optimizations. But I also give Japheth a big dose of credit there, too. ;-)

> > (Although C compilers ain't
> > that bad anymore.) You do indeed have to tweak a lot if you want
> > speed (which adds up).
>
> Well, the tweaking being optional is the point of a HLL in the first
> place.

It would be better if the compiler authors documented what was optimized and what still needed work.

> > And anything ever is? No, but assembly is no worse. Heck, if you really
> > want, just mix the two and have the "best of both worlds."
>
> Nearly all runtime libraries have some assembler code somewhere for that
> reason, for primitives that greatly influence speed (like memcpy,
> strscan, etc.). But that is something else than developing APPS in it.

Actually, from what I've seen, they tend to want to be self-hosting, so it's layer upon layer upon layer.

> > It has its uses. And some people prefer it. It all depends what you
> > want and how much you know.
>
> See my other mail. People can do what they want to. They can even wave
> away productivity comments with an "I like the puzzle side of assembler".
> It is the totally crooked justifications that madden me.

Please list your preferred compilers so that we can see which ones optimize the best in your mind. I use DJGPP (GCC) and OpenWatcom among others, both of which are good but could still ideally be better. And FASM or Octasm (both written in themselves) are faster than YASM (C) which is faster than NASM (C).

marcov

13.04.2008, 23:36

@ Rugxulo
 

FPC for DOS / FreeDOS

> > True. But *nix is overly biased towards C (and its successor C++), due
> > to the two evolving together. It is not a logical measuring rod for
> > languages.
>
> HLLs are not evil. They can be very useful. I like the fact that I can
> easily recompile Bzip2 for my Pentium 4 or even my old Pentium 1. But no,
> even that doesn't help as much as I'd like. I still think it's too slow
> (well, more noticeable on my old 486). Even on modern machines, shaving
> time can be a blessing. (You don't compile -O2 when debugging do you? It
> takes much much longer. And don't forget "make": if speed wasn't
> important, we'd recompile the entire project from scratch instead of only
> that which was modified.)

Take e.g. Delphi or other Wirthian languages, and you'll see otherwise. Slow compiling is a notorious C/C++ problem, not an HLL problem. It originates in the constant reinterpretation of headers and in the fact that the compiler has to start up again for each bit of code (and yes, a compiler starts up more slowly than an assembler, but restarting for every file is not a requirement for either of them).

> > It's more managing the size of the code that I'm worried about.
>
> I wonder if you're drawing from your experience with VP here (complex,
> unmaintainable assembly).

No, it's not. I only dove into the VP code for a day (a bit more for the RTL parts).

It is more from when I was young and wrote some BBS software in assembler long ago, and spent months tuning and speeding up a logfile analyser that the people actually using it turned out to schedule at night.

And in general, if projects get larger, you simply want to catch every bug where and when you can, _before_ you have to compile with debug info.

> Neither GCC nor OpenWatcom support inline assembly intrinsically (they
> need an external program).

AFAIK gcc even allows two kinds of inline assembler: directly inline (which is minimally parsed and passed through to the assembler), and a kind of register-independent assembler (which is quite funky and scary, btw). I don't know Watcom well enough.
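Something like this, to give an idea (a from-memory sketch of gcc's extended asm; the function name is made up, and check the gcc docs for the exact constraint rules):

    /* gcc "extended asm": you specify constraints ("+r" = read/write
       register, "r" = input register) and gcc assigns the actual
       registers, so the snippet stays register-independent. */
    static inline int add_asm(int x, int y)
    {
        __asm__("addl %1, %0" : "+r"(x) : "r"(y));
        return x;
    }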

> But still, somebody did the heavy lifting for you, which you
> can't always rely upon.

So you don't use any software written by others on your 486?

> I don't know who thought that was a good idea. "Security vulnerability
> that might cause you to accidentally reboot and lose data ... okay we
> fixed it, now we're gonna force you to reboot." (!!!)

It's for use in corporations, where until recently people never realised that computers use power, and left them on for weeks. Without forced rebooting, vendors would lose the ability to push critical patches during outbreaks.

However, while not 100% evil, they implemented it magnitudes too aggressively, IMHO.

> If you are doing anything time-intensive (compiling, compressing, finding
> files), you will notice how slow things are.
> And a typical C compiler is
> just not always fast enough.

Well, maybe if you invested the same amount of time in the HLL code as in your hand-tuned assembler, then it would be. The point is not the assembler, but the time invested to speed it up. Most engineers don't want to invest those magnitudes of time in something that is old hat in two years. If you undertake such a project for larger pieces of software, you can start rewriting it for newer processors before it is finished.

And don't forget that the whole GNU stack is designed for portability. I do agree they could speed up some much-used primitives (and compression is certainly one of them) on x86, and keep the C code for the rest (and afaik most compression already contains asm), but that is not a general rule that applies to all software.

> And in case you haven't noticed my previous posts, 7ZA + HX is
> much faster than p7zip (DJGPP).

And that proves exactly WHAT? That you and Japheth put in more time on optimizing?

> It would be better if the compiler authors documented what was optimized
> and what still needed work.

That needs analysis first, and that is part of the optimization process too. If it were that easy, it would already have been done.

> Actually, from what I've seen, they tend to want to be self-hosting, so
> it's layer upon layer upon layer.

There is always layering over assembler, if only to be able to abstract the architecture.

> Please list your preferred compilers so that we can see which ones
> optimize the best in your mind. I use DJGPP (GCC) and OpenWatcom among
> others, both of which are good but could still ideally be better. And FASM
> or Octasm (both written in themselves) are faster than YASM (C) which is
> faster than NASM (C).

That only proves that you care more about speed than the YASM and NASM authors do, not that asm is superior.

Damn, done it again. Now really, this is my last post in these endless discussions.

Rugxulo

Usono,
14.04.2008, 20:40

@ marcov
 

FPC for DOS / FreeDOS

> Take e.g. Delphi or other Wirthian languages, and you'll see otherwise.
> Slow compiling is a notorious C/C++ problem, not an HLL problem. It
> originates in the constant reinterpretation of headers

It seems that OpenWatcom is faster to compile than GCC. And its output gets about 85% of GCC's speed. But yeah, I do wonder why no C compiler ever bundles all the .H files into one big database (since they aren't updated often).

> It is more from when I was young and wrote some BBS software in assembler
> long ago

If people still used 486s or non-superscalar architectures (e.g. VIA ?), they'd probably all be assembly programmers. But since clock speed (and cpu improvements) have all gotten better, it's less of an issue ... in most cases. Still, I wonder who is completely satisfied with 100% of their programs and thinks they don't need to be any faster. (Any compiler/compressor could be ideally faster.)

> And in general, if projects get larger, you simply want to catch every bug
> where and when you can, _before_ you have to compile with debug info.

Uh, I wouldn't call C++ the easiest thing to debug (but to be fair, I don't know or use it).

> So you don't use any software written by others on your 486?

99% of the software I (rarely) use on my 486 is written by someone else. I'm not one of the geniuses that writes their own OS and all the apps (although they do exist). Just trust me, in case you've forgotten, a 486 is slow! Therefore, you are bound to say, "Man, this is annoying, couldn't it be faster?" Even using a P166 is much much less annoying (approx. 10x faster). Heck, even using my P4 w/ XP makes me irritated sometimes.

> Well, maybe if you invested the same amount of time in the HLL code as in
> your hand-tuned assembler, then it would be.

No. The compiler is inherently limited, pure assembly is not. It doesn't mean you can't optimize at the HLL level (search Google for C optimizations), but their generic optimizations don't always pan out.

> The point is not the assembler,
> but the time invested to speed it up. Most engineers don't want to invest
> those magnitudes of time in something that is old hat in two years.

The problem here is that you're assuming everyone has the newest, latest Core 2 "Penryn" processors with quad cores, etc. Sure, those are very fast (supposedly; never tried one myself). But 95% of the world doesn't use those yet. So you can code blissfully ignorant of the whole world, or you can spend time working towards that which is slower but still functional.

BTW, assembly doesn't usually get slower with newer processor upgrades. Sure, the advantage can disappear (see HA C-- rewrite), but the person writing it had a tangible reason to: his own personal computer. If you don't have a 486, you won't write for one. It's personal when you do, though, because you want it to run faster.

> > If you undertake such a project for larger pieces of software, you can
> > start rewriting it for newer processors before it is finished.

Sure, if you have infinite amounts of money and are willing to upgrade your cpu. Most people don't do that, however.

> I do agree they could speed up some much-used primitives (and
> compression is certainly one of them) on x86, and keep the C code for the
> rest (and afaik most compression already contains asm)

Only the best compressors have asm speedups. The rest just wimp out and say, "Good enough" (and that's probably fair in some cases). But just FYI, there are reasons why Zip/Gzip (Deflate) and BZip2 (BWT) are still popular despite much better compression available: low memory footprint and speed.

> > And in case you haven't noticed my previous posts, 7ZA + HX is
> > much faster than p7zip (DJGPP).
>
> And that proves exactly WHAT? That you and Japheth put in more time on
> optimizing?

p7zip is pure C/C++ (last I checked) while HX is definitely not. (And I had no involvement in writing any code for that.) You cannot expect pure HLL to always be faster than all pure assembly. In fact, it IS assembly (once translated).

> > FASM or Octasm (both written in themselves) are faster than YASM (C)
> > which is faster than NASM (C).
>
> That only proves that you care more about speed than the YASM and NASM
> authors do, not that asm is superior.

The YASM and NASM I use are DOS/DJGPP compiles, and DJGPP is not the best optimizing compiler (although fairly good). So that makes a difference too.

"Bare" Assembly (language) is of course superior because without it you can't run anything! Even HLL all becomes assembly eventually. It doesn't mean you should write everything in pure assembly. A skilled coder who knows what he/she is doing can outmatch a braindead compiler any day. But HLLs have advantages too, just not 100% of the time (which I hope you aren't advocating).

marcov

14.04.2008, 21:59

@ Rugxulo
 

FPC for DOS / FreeDOS

> It seems that OpenWatcom is faster to compile than GCC. And its
> output gets about 85% of GCC's speed. But yeah, I do wonder why no C
> compiler ever bundles all the .H files into one big database (since they
> aren't updated often).

The interpretation of .h files is bound to whatever preprocessor symbols are in effect: two #includes of the same header can have a totally different preprocessor state (and thus different preprocessed code) as a result.
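A contrived but compilable sketch (the header and macro names are made up; the mechanism is what real "template header" tricks rely on):

    /* pair.h - deliberately no include guard: what it declares depends
       on the macros in effect at each #include */
    struct PAIR_NAME { PAIR_TYPE a, b; };

    /* main.c - the same header, included twice, yields two types */
    #define PAIR_NAME int_pair
    #define PAIR_TYPE int
    #include "pair.h"              /* struct int_pair  { int a, b;  }; */
    #undef PAIR_NAME
    #undef PAIR_TYPE
    #define PAIR_NAME char_pair
    #define PAIR_TYPE char
    #include "pair.h"              /* struct char_pair { char a, b; }; */

So a compiler can't just cache "the" compiled form of a header; it would have to cache one per preprocessor state.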

> > It is more from when I was young and wrote some BBS software in
> > assembler long ago
>
> If people still used 486s or non-superscalar architectures (e.g. VIA ?),
> they'd probably all be assembly programmers.

I doubt that. It was already waning in 486 times.

> But since clock speed (and cpu improvements) have all gotten better, it's less of an issue ... in
> most cases. Still, I wonder who is completely satisfied with 100% of their
> programs and thinks they don't need to be any faster. (Any
> compiler/compressor could be ideally faster.)

Any non-trivial assembler code too. So there is no difference there _if_ you are willing to invest disproportionate amounts of time.

> > Well, maybe if you invested the same amount of time in the HLL code as
> in
> > your handtuned assembler, than it would.
>
> No. The compiler is inherently limited, pure assembly is not. It doesn't
> mean you can't optimize at the HLL level (search Google for C
> optimizations), but their generic optimizations don't always pan out.

I'm not the kind who says to just throw your code at the compiler, and it will figure out how to make it fast. But your statement is the other extreme.

> > The point is not the assembler,
> > but the invested time to speed it up. Most engineers don't want to
> invest
> > those magnitudes of time for something that is old hat in two years.
>
> The problem here is that you're assuming everyone has newest, latest Core
> 2 "Penryn" processors with quad cores, etc.

You assume you know what I assume; how odd. You don't know me.

Note that I turned away from assembler when I had a 486 DX2/80, except for really reusable routines. I turned away from that with my

> BTW, assembly doesn't usually get slower with newer processor upgrades.

Yes it does (on a per-cycle basis). E.g. pre-Pentium assembler doesn't take register stalls into account. Not taking prefetch into account can hurt; similarly with branch prediction (the branch-taken prediction is a simple heuristic, and if your hand-coded assembler assumes otherwise, it hurts).
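
A toy benchmark (plain standard C, nothing CPU-specific assumed) makes the branch prediction point measurable: the same loop over the same bytes runs much faster once the data is sorted, purely because the branch becomes predictable. On a CPU without branch prediction, the two timings come out nearly identical.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N    262144
#define REPS 32

static int cmp(const void *a, const void *b)
{
    return *(const unsigned char *)a - *(const unsigned char *)b;
}

static long run(const unsigned char *d)
{
    long sum = 0;
    int r, i;
    for (r = 0; r < REPS; r++)
        for (i = 0; i < N; i++)
            if (d[i] >= 128)    /* unpredictable on random data */
                sum += d[i];
    return sum;
}

int main(void)
{
    static unsigned char d[N];
    clock_t t0;
    long s;
    int i;

    for (i = 0; i < N; i++)
        d[i] = (unsigned char)(rand() & 0xFF);

    t0 = clock();
    s = run(d);
    printf("random: %ld ticks (sum %ld)\n", (long)(clock() - t0), s);

    qsort(d, N, 1, cmp);        /* sorted: the branch becomes predictable */

    t0 = clock();
    s = run(d);
    printf("sorted: %ld ticks (sum %ld)\n", (long)(clock() - t0), s);
    return 0;
}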

For an assembler programmer you don't seem to know much about this !?!?!?!

> > If you finish such project for larger pieces of software, you can start
> > rewriting it for newer processors before it is finished.
>
> Sure, if you have infinite amounts of money and are willing to upgrade
> your cpu. Most people don't do that, however.

I hardly consider a EUR 40 CPU "infinite amounts of money". Besides, I already had a 486 when I stopped being obsessive about assembler (admittedly a fairly heavy one, at least for that generation).

> > I do agree they could speed up some very much used primitives (and
> > compression is certainly one of them) on x86, and keep the C code for
> the
> > rest (and afaik most compression already contains asm)
>
> Only the best compressors have asm speedups. The rest just wimp out and
> say, "Good enough" (and that's probably fair in some cases). But just FYI,
> there are reasons why Zip/Gzip (Deflate) and BZip2 (BWT) are still
> popular despite much better compression available: low memory footprint
> and fast.

I know. I used paq over 10 years ago and squeezed it into my 16 MB of RAM. Decoder availability is another reason, BTW.

> The YASM and NASM I use are DOS/DJGPP compiles, and DJGPP is not the best
> optimizing compiler (although fairly good). So that makes a difference
> too.

I'm no real wizard, but AFAIK gcc optimizes fairly decently with generic optimizations, yet largely lacks a decent CPU-dependent peephole optimizer.

Still, I'd expect an assembler's speed to be bound at a different level (parsing, I/O) than instruction-level optimization.

> "Bare" Assembly (language) is of course superior because without it you
> can't run anything!

Well, that is because you have a 486. With the introduction of the P6 cores (PPro and later) that changed.

Rugxulo

Homepage

Usono,
15.04.2008, 01:45

@ marcov
 

FPC for DOS / FreeDOS

> .h files interpretation are bound to all preprocessor symbols getting in.
> two #includes of the same header can have totally different preprocessor
> state (and thus preprocessed code) as result.

So it's impossible without breaking things? (I dunno, honestly.)

> > If people still used 486s or non-superscalar architectures (e.g. VIA
> ?),
> > they'd probably all be assembly programmers.
>
> I doubt that. It was already waning in 486 times.

Kinda silly to buy a 10x faster computer and then slow it down 10x with more useless abstraction. :-/

> Note that I turned away from assembler when I had a 486 DX2/80, except for
> really reusable routines. I turned away from that with my

From what I've (barely) learned recently: the 486 was pipelined (vs. the 386, which was only partially pipelined), while the Pentium was superscalar (two pipelines), and later models even more so. The Pentium even had a pipelined FPU. And yet there is plenty you can do to help a 486, 586, 686, etc. At least the Quake authors thought so (and they used DJGPP as the compiler, BTW).

> > BTW, assembly doesn't usually get slower with newer processor upgrades.
>
> Yes it does. (based on a per cycle basis). E.g. pre pentium assembler
> doesn't take register stalls into account.

486s do indeed suffer from AGIs, just not as badly as Pentiums. And since acceptable 486 code will "by default" run much, much faster (usually twice as fast or more), such penalties are easily offset. But yes, it's inherently hard to optimize, period! It's still worthwhile, IMO.

> Not taking prefetch into
> account can hurt, similar with branch prediction (the branch taken is a
> simple heuristic, if your handcoded assembler does otherwise, it hurts).
>
> For an assembler programmer you don't seem to know much about this
> !?!?!?!

I don't understand it fully (who can?); it's very complex. Plus, it's hard to prove easily, so I tend to still look for easy answers. I know this for sure: 386, 486, 586, 686 all require different measures of optimization. You can somewhat optimize for all of them, though.

> > Sure, if you have infinite amounts of money and are willing to upgrade
> > your cpu. Most people don't do that, however.
>
> I hardly consider a CPU of Eur 40 "infinite amounts of money". Let alone,
> that I had a 486 when I stopped being obsessive about assembler (admitted,
> a fairly heavy one, at least for that generation)

The "infinite" here refers to the continual process of upgrading your processor every so often when a faster one is released (assuming your motherboard is compatible, ugh). And yes, a 486 DX2/80 is quite fast. But even the lowliest Pentium is faster than that. (Of course, early Pentiums had the FDIV bug, plus they generate more heat and require more energy.)

> > The YASM and NASM I use are DOS/DJGPP compiles, and DJGPP is not the
> best
> > optimizing compiler (although fairly good). So that makes a difference
> > too.
>
> I'm no real wizard, but afaik gcc optimizes fairly decently with generic
> optimizations, but largely misses a decent CPU dependant peephole
> optimizer.

IMO, they could use tons more help regarding their 586-and-below optimizations. However, I'm unlikely to be of use in that regard (at least, not yet). :-/

> > "Bare" Assembly (language) is of course superior because without it you
> > can't run anything!
>
> Well, that is because you have a 486. With the introduction of the P6
> cores (PPRo+) that changed.

Early PPros actually ran 16-bit code slower than later-model Pentiums! (And yes, I have other computers too, including this P4, heh.) They all act differently, but for sure they should mostly run things as fast, if not faster.

marcov

15.04.2008, 16:11

@ Rugxulo
 

FPC for DOS / FreeDOS

> > two #includes of the same header can have totally different
> preprocessor
> > state (and thus preprocessed code) as result.
>
> So it's impossible without breaking things? (I dunno, honestly.)

No, just difficult and a lot of work. Since the "precompiled" header possibly depends on the entire preprocessor state up to that point, the typical way is to reduce that context to the values that really matter for the header and store those. Then you can "simply" compare and decide whether to reinterpret or use the precompiled form.

Moreover, the state is not entirely random (certain inclusion orders are pretty constant), so a history of how the context is built up can also help.

I only know this from theory; how the C compilers that do this (gcc doesn't) implement it, I don't know, also because most of them are proprietary.
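
In rough C, the validation step could look something like this (a sketch from that theory only; every name here is made up, and no real compiler is claimed to work this way):

#include <string.h>

/* stored alongside the precompiled header: every macro the
   header's preprocessing actually depended on, with the value
   it had at compile time (NULL = was not defined) */
struct dep { const char *macro; const char *value_then; };
struct pch { const struct dep *deps; int ndeps; };

/* stub standing in for a lookup into the live preprocessor state */
static const char *current_macro_value(const char *macro)
{
    (void)macro;
    return NULL;
}

static int same(const char *a, const char *b)
{
    if (a == NULL || b == NULL)
        return a == b;
    return strcmp(a, b) == 0;
}

int pch_still_valid(const struct pch *p)
{
    int i;
    for (i = 0; i < p->ndeps; i++)
        if (!same(current_macro_value(p->deps[i].macro),
                  p->deps[i].value_then))
            return 0;   /* context differs: reinterpret the header */
    return 1;           /* context matches: reuse the precompiled form */
}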

> Kinda silly to buy a 10 x faster computer and then slow it down 10 x by
> more useless abstraction. :-/

If that were the factor, yes, that would be sad. Luckily it isn't.

> From what I've (barely) learned recently: 486 was pipelined (vs. 386
> partially pipelined)

Yes, but only in instruction fetching. IOW, it still couldn't get faster than one instruction per clock. The 386 had an even simpler prefetch, I believe.

But note the main thing here: my leaving assembler as my main programming language dates from 486 times (and partially earlier), not from Core 2 times, as the advocates here try to insinuate.

> while Pentium was superscalar (two pipelines),

The Pentium was superscalar, but in a very limited way (1 1/2 pipelines; the V pipe was very restricted).

> and
> later models even moreso. The Pentium even had a pipelined FPU. And yet,
> there is plenty you can do to help a 486, 586, 686, etc. At least
> the Quake authors thought so
> (which used DJGPP for the compiler, BTW).

Actually it is kind of funny that I'm having this discussion. Last Saturday I junked all my old PCs. The slowest PC now is an Athlon XP 2000+.

And yes, I remember the Quake situation pretty well, since I was bitten badly by it. I bought not a Pentium but a Cyrix (a P166+ for 60% of the price of a P133). Because it didn't have the same FPU/CPU mix tradeoffs, Quake was painful.

Of course, since I used the difference in money to buy a Voodoo card, it was maybe still a better deal :)

I do still have several slower computers, but they are not intel/PCs.

> 486s do indeed suffer from AGIs, just not as bad as Pentiums. And since
> acceptable 486 code will "by default" run much much faster (usually twice
> as fast or more), such penalties are easily offset. But yes, it's
> inherently hard to optimize, period! But it's still worthwhile, IMO.

I meant on faster computers. A misprediction gets more expensive as computers get speedier.

> > For an assembler programmer you don't seem to know much about this
> > !?!?!?!
>
> I don't understand it fully (who can?), it's very complex. Plus, it's hard
> to prove it easily, so I tend to still look for easy answers. I know this,
> for sure: 386, 486, 586, 686 all require different measures of
> optimization. You can somewhat optimize for all of them, though.

Using the timestamp counter, one can benchmark sequences of instructions pretty well. But it is indeed hard. The trick is that not all code is important in that respect, and for the code that matters (like the typical RTL primitives) it is worth it. See e.g. the fastcode project.
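
E.g. with GCC-style inline assembly, reading the counter around a sequence takes only a few lines (a minimal sketch assuming a Pentium-class or later CPU; for serious measurements you would also serialize with CPUID and subtract the measurement overhead):

#include <stdio.h>

/* read the CPU's time-stamp counter (RDTSC, Pentium and later) */
static unsigned long long rdtsc(void)
{
    unsigned int lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((unsigned long long)hi << 32) | lo;
}

int main(void)
{
    unsigned long long start, stop;
    volatile int x = 0;   /* volatile so the loop isn't optimized away */
    int i;

    start = rdtsc();
    for (i = 0; i < 1000; i++)
        x += i;           /* the sequence under test */
    stop = rdtsc();

    printf("%llu cycles\n", stop - start);
    return 0;
}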

> The "infinite" here is regarding the continual process of upgrading your
> processor every so often when a faster one is released (assuming your
> motherboard is compatible, ugh).

Even then: 10 years * (EUR 100 for an upgrade kit every 3-4 years) = EUR 200-300. Not trivial, but certainly not infinite.

> IMO, they could use tons more help regarding their 586 or less
> optimizations. However, I'm unlikely to be of use in that regard (at
> least, not yet). :-/

That's because the people who are still interested in it are wasting their time writing everything in assembler :-) When they grow out of it, they pursue other interests. Somehow that is tragic.

> Early PPros actually ran 16-bit code slower than later-model Pentiums!

No, all PPros did. Only the P-II resolved this.

> (And yes, I have other computers too, including this P4, heh). They all
> act differently, but for sure they should mostly run things as
> fast, if not faster.

They all did, with the exception of the upgrade from the C=64 to my first PC, a 386. While the 386 had a 20-times-faster clock, the software was way slower; I can clearly remember thinking PC games really sucked, despite the PC's greater power in every respect (CPU, HD, memory, VGA, SB).

The main difference was that all C=64s were the same, and all software took huge advantage of that. The chaos on the PC side really cost performance.

Rugxulo

Homepage

Usono,
16.04.2008, 02:23

@ marcov
 

FPC for DOS / FreeDOS

> Yes, but only in instruction fetching. IOW the it still couldn't get
> faster than one instruction/clock. The 386 had an even simpler prefetch

A 386 at best only got one instruction per two clocks. So a 386 at the same clock speed as a 486 was half as fast (right?).

> my leaving assembler as main programming language, dates from 486 times
> not from Core2 times as insinuated

It's a valuable skill to optimize for size or speed. It's NOT a skill to just say, "Bah, buy a new computer."

> > while Pentium was superscalar (two pipelines),
>
> but very limited. (1 1/2 pipeline, v was very limited).

But "1 1/2" is better than only 1. Trying to compile Allegro on a 486 was painfully slow. ;-) These days I would sooner recommend 7-Zipping the lib than subjecting anyone to that. :-P

> > and
> > later models even moreso. The Pentium even had a pipelined FPU. And
> yet,
> > there is plenty you can do to help. At least
> > the Quake authors thought
> so
> > (which used DJGPP for the compiler, BTW).
>
> Actually it is kind of funny that I'm having this discussion. Last
> saturday I junked all old PCs. The slowest PC now is an Athlon XP2000+.

Someone with multiple "old" CPUs could always do clusters with things like dynebolic. I might try to convince my bro to try that on all his recycled CPUs. ;-)

> I bought not a pentium but a Cyrix (a P166+ for 60% of the
> price of a P133). Due to not having the same fpu/cpu mix tradeoffs, Quake
> was painful.

You can always recompile it these days and try again ;-) (well, if you hadn't junked the computer). Heck, last I tried, Quake wouldn't even run under XP (no surprise there). Of course, DJGPP circa 1996 wasn't quite as compatible as it is now.

But don't worry, even if that doesn't run, there's always Wolf 3D, Doom (or FreeDoom), Chasm: The Rift, Ken's Labyrinth, etc. ;-)

> Of course when I used the difference in money to buy a voodoo card, it was
> maybe still a better deal :)

What OS(es) did you run on that Cyrix machine?

> I do still have several slower computers, but they are not intel/PCs.

Atari 800XL? VIC20? Apple II?

> > I don't understand it fully (who can?), it's very complex.
>
> Using timestamp counters, one can benchmark pretty well. But it
> is hard.

Luckily, even JEMM386 emulates such for us in ring 3. ;-)

> Trick is that not all code is
> important in that respect, and for the code that matters (like the typical
> RTL primitives) it is worth it. See e.g. the fastcode project.

Have you ever heard of liboil? (seems mostly GCC-centric)

> Even then. 10 years * (E100 for upgrade kit every 3-4 years) = E200-300.
> Not trivial, but certainly not infinite.

Sorry if I feel silly upgrading only because it's slightly faster. If an "old" cpu still works and you can be productive on it, why throw it out? Plus, it can be a good learning experience. (link)

> > IMO, they could use tons more help regarding their 586 or less
> > optimizations. However, I'm unlikely to be of use in that regard (at
> > least, not yet). :-/
>
> That's because the people that are still interested in it, are wasting
> their time on writing anything in assembler :-) When they grow over it,
> they pursue other interests. Somehow that is tragic.

GCC is not exactly what I'd call easy to understand (internally). Besides, I have yet to find any decent 486 optimization tips that truly impressed me (but it's supposedly very sensitive to alignment, way more than the 386 or 586). Even VIA's CPUs (except the upcoming Isaiah) are all in-order execution (not superscalar), and that's why Debian/Ubuntu is only compiled for 486+ by default (last I heard). And they use less power too.

> > (I have other computers too, including this P4). They all
> > act differently, but they should mostly run things as
> > fast, if not faster.
>
> They all did, with exception of the upgrade from the C=64 to my first PC,
> a 386. While the 386 had 20 times faster clock, the software was way
> slower, I remember think PC games really sucked, despite the
> PCs larger power (CPU, HD, memory, VGA, SB)

A fast 386 was supposedly able to emulate an Atari 800 (1.79 MHz) at full speed; the rule of thumb is that you need about 10x the power to emulate anything. And BTW, that emulator (PC Xformer) was written (at that time) in assembly for DOS for maximum speed. :-P

BTW, I also heard that the C64 (honestly, I never owned either) did sound mixing in hardware, unlike the PC, so that alone would slow the PC down a bit. Also, PCs were more expensive, so they didn't catch on as well at the time.

Japheth

Homepage

Germany (South),
12.04.2008, 07:33
(edited by Japheth, 12.04.2008, 08:03)

@ marcov
 

FPC for DOS / FreeDOS

> It might not be impossible to move dunes in the Sahara using spoons.
> However it is neither useful nor productive :-)

> Worse, at least moving dunes is simple, yet just a lot of work. The danger
> with asm is that when it really grows, you'll never get it somewhat
> bugfree.

Usually those "ASM vs HLL" (or "DOS vs Windows/Linux") discussions are comparing apples and pears, because on the one side there are ASM/DOS fanatics and on the other side there are "victims of popular myths", believing that "the majority cannot be wrong". You're obviously on the latter side, but be warned: you're in a DOS forum, where some participants might believe that "the majority is always wrong" (or, more popular, that "one million flies might be wrong when sitting on a pile of sh*t").

I'm experienced in both ASM and C(++), so choosing ASM intentionally as an implementation language might sound absolutely unreasonable to you - size and speed usually don't matter.

However, I'm doing just this, and I'm neither a masochist nor unreasonable. So there must be something wrong with your "spoon in the desert" picture.

There are some advantages of HLLs compared to ASM; portability is the most important one. But very often portability isn't an issue at all, and to a large degree portability is - as far as HLLs are concerned - a "myth" with little relation to reality.

Besides the portability aspect, if you compare a Win32 program written in (M)ASM and a Win32 program written in C, there is almost "no difference" in complexity. The ASM version probably will need slightly more lines for the same functionality, that's all. OTOH, constructs like

*((dir_node *)dir->sym)->e.constinfo->is_this_true = FALSE;

fortunately aren't possible in ASM, and therefore ASM doesn't have the bad "write-only" image that some HLLs do.

Even more, ASM programmers tend to add comments to their source, a habit which many C (and Pascal?) programmers seem to ignore - possibly because they believe their language is "self-documenting"?

---
MS-DOS forever!

marcov

12.04.2008, 13:27

@ Japheth
 

FPC for DOS / FreeDOS

> You're obviously on the
> latter side,

That's a horribly biased assumption. Everybody who doesn't agree with you, of course, can't think for himself and just goes with the flow.

And of course I'm sparring a bit for fun, and exaggerating the parallels (the desert example). But the point of this is to separate actual, valid reasons from the "we are a little minority that is always right" principle. The smaller the minority, the more extreme the opinions.

> but be warned: you're in a DOS forum, where some participants
> might believe that "the majority is always wrong" (or, more popular, that
> "one million flies might be wrong when sitting on a pile of sh*t").

Well, stuff me in a third category, since I have programmed assembler for a living, and am still in contact with others who do - some even full-time, and not on the side like I did (our microcontroller work is only in support of the PC).

However, if you ask them, they have very sane reasons and can back this up with actual facts. _ALL_ of them also have the same application in a HLL, BTW.

> so to choose ASM intentionally as
> implementation language might sound absolutely unreasonable for you - size
> and speed usually don't matter.

Correct. However, what worries me more in these kinds of discussions is the failure to come up with any sane reasons. Everybody talks about "embedded", but if you ask further, it turns out they use yesteryear's PCs, which were never embedded in the first place.

> However, I'm doing just this, and I'm neither a masochist nor
> unreasonable.

Well, a lot of loons don't realise that they are. But OK, let's have your reasons. Even "it is just a hobby, and I like the puzzle aspect" is a better reason than what I have heard thus far.

> There are some advantages of HLLs compared to ASM, portability is the most
> important one, but very often portability isn't an issue at all, and to a
> large degree portability is - as far as HLLs are concerned - a "myth" with
> few relation to reality.

Well, at least we agree on that. Portability _is_ important, but it is not black-and-white across the HLL/asm border.

> *((dir_node *)dir->sym)->e.constinfo->is_this_true = FALSE;
>
> fortunately aren't possibly in ASM and therefore ASM hasn't the bad
> "write-only" image like some HLLs.

Well, that is for me one of the fundamental reasons. And then I don't mean the above mess per se, but the ability to abstract a datatype from its memory layout, and in general the syntax check of the compiler.

I know, you can give fields of records an offset, you can work around some stuff with macro systems that get systematically more complex (though that is not strictly assembler anymore), and in fact C's history is much in the same space (some of C's predecessors were closer to a complex macro assembler than to C). You can even do a kind of "assembler lint" to detect frequently made flaws.

But they remain workarounds, and the processing program (the assembler/compiler) has way less information to work with (and to check things) in the assembler case than in the compiler case. And that already assumes you plugged the other defects.

Moreover, C is known as the assembler of HLLs. So you are already comparing to rock bottom.

> Even more, ASM programmers tend to add comments in their source, a habit
> which many C (and Pascal?) programmers seem to ignore - possibly because
> they're believing their language is "self-documenting"?

Depends. In general, people still doing asm are doing it for a very specific reason, and they (or their boss) are typically not paid directly in proportion to their output, unlike, e.g., a contract programmer.

In those kinds of niches, people can do pretty much anything they want, not hampered by any competition (since the main factor is not assembler but their often long experience with the company).

Japheth

Homepage

Germany (South),
12.04.2008, 16:18

@ marcov
 

FPC for DOS / FreeDOS

> Correct. However what worries me more in these kinds of discussions is the
> failure to come up with any sane reasons. Everybody talks about "embedded",
> but if you ask further, it turns out they use yesteryears PC, which were
> never embedded in the first place.

I didn't talk about "embedded" and I mentioned the reason in my previous post. To repeat myself: the productivity advantage achieved by C compared to ASM is not significant if portability isn't an issue.

> the processing program (the
> assembler/compiler) has way less information to work with (and to check
> things) in the assembler case then in the compiler case.

No.

> Moreover, C is known as the assembler of HLLs. So you comparing already to
> rock bottom.

Yes. But what's the point? Do you believe that the chain ASM -> C -> C++ -> C# does also - quite naturally - suggest a productivity boost for each level?

---
MS-DOS forever!

marcov

13.04.2008, 02:54

@ Japheth
 

FPC for DOS / FreeDOS

> I didn't talk about "embedded" and I mentioned the reason in my previous
> post. To repeat myself: the productivity advantage achieved by C compared
> to ASM is not significant if portability isn't an issue.

To repeat myself: that is not my experience.

> > the processing program (the
> > assembler/compiler) has way less information to work with (and to check
> > things) in the assembler case then in the compiler case.
>
> No.

Your lack of counterarguments is disturbing.

> > Moreover, C is known as the assembler of HLLs. So you comparing already
> to
> > rock bottom.
>
> Yes. But what's the point? Do you believe that the chain ASM -> C -> C++
> -> C# does also - quite naturally - suggest a productivity boost for each
> level?

Up to C++, yes, though not automatically. C# doesn't belong in that sequence, IMHO. It is often put there for marketing, not technical, reasons.

Japheth

Homepage

Germany (South),
13.04.2008, 09:50

@ marcov
 

FPC for DOS / FreeDOS

> To repeat myself: that is not my experience.

I got you already and there's no need to repeat yourself. OTOH, my repetition was forced by your claim that I failed "to come up with any sane reasons".

IMO it's absolutely "sane" if I prefer to decide what to use based on MY experience, and it would be somewhat "insane" if I took YOURS instead.

> > > the processing program (the
> > > assembler/compiler) has way less information to work with (and to
> check
> > > things) in the assembler case then in the compiler case.
> >
> > No.
>
> Your lack of counter arguments is disturbing.

As a "contributor" to FPC, and since you also claimed to have written some code in assembly, you should be aware that there is no real assembly language "standard", and there's also no clear border where a thing stops being called an assembler and "starts" to be a compiler.

In other words, some assembler implementations deliberately "forget" the types of variables (NASM), and others have rather strong type checking. And nothing prevents you from writing an assembler which - optionally - also understands CLASS, VIRTUAL, FRIEND, ... and thus supports OO directly.

In short, your claim is nonsense "by design", because basically there is no limit to the degree to which an assembler may collect and remember "type" information.

> Up to C++, yes, though not automatically. C# doesn't belong in that
> sequence IMHO. It is often put there for marketing, not technical reasons

I like C# and think it is perfectly valid to put it into this line. JIT compiling is regarded as "progress" by the "majority", so why not?

---
MS-DOS forever!

marcov

13.04.2008, 23:17

@ Japheth
 

FPC for DOS / FreeDOS

> IMO it's absolutely "sane" if I prefer to decide what's to use based on MY
> experience and it would be somewhat "insane" if I take YOURs instead.

Well, it is a pity if this turns into a yes/no contest, so let's just leave it and focus on the other subthreads.

(...)
> you should be aware that there is no real
> assembly language "standard" and there's also no clear border where a
> thing is no longer to be called an assembler and "starts" to be a
> compiler.

First, I'm not a compiler expert. However, I'm not a naive newbie either.

That border is arbitrary and vague indeed, but that doesn't mean there are no differences. The usual definition is "compile from a higher to a lower language".

Wikipedia seems to define it that way, and I've seen these discussions get out of hand because the definition of "high" and "low" is discussed ad infinitum.

Stronger: the other way around ("low" to "high") is still a compiler, since it is more than a bit of rewriting; analysis is required (like translating very clean C to Java to run it on a JVM on a phone).

And a macro assembler that "compiles" to assembler also fits the definition (and the poor slobs who upload microcode patches to the CPU might argue that about assembler to microcode too). And strictly speaking, one could even transcode a lower language to a higher language.

So in general I would conclude it in one word: abstractions, built into the system rather than duct-taped on via e.g. preprocessors and macros. So: abstractions in data (data types), program build-up (information hiding, separate compilation), and flow constructs ("proper" functions, loops, etc.). Abstraction from the machine too (it was actually one of the main reasons HLLs developed), but that is not required. Personally I'd value abstract concepts for data, code and program organisation more.

The exact border is still vague, and one can bicker over that for decades, so I won't try, especially with exhibit A: the C language, which does way too much with a preprocessor lurking around the corner.

But while one can bicker about the detail level as much as one wants, I think the principal division is fairly clear.

> In other words, some assembler implementations deliberately "forget" the
> types of variables (NASM) and others have rather strong type checking. And
> nothing prevents you to write an assembler which - optionally - also will
> understand CLASS, VIRTUAL, FRIEND, ... and thus support OO directly.

But is it then still an assembler at all? Maybe it then has become a compiler for a language with a bit of assembler here and there?

If you think about it that way, when does the macro handling become a language in its own right, and can the bits of asm in between be seen as "inline assembler"?

Such thoughts were initially the motivation for the above definition (without "high" and "low"). Apparently the difference is when it stops being a macro language and becomes a language.

And of course that line is also blurry again, but since macros (and assembling) originate in simple substitutive behaviour or simple transformation, it is logical to call it a language once it starts to deal with abstractions allowing more involved transformations.

> In short, your claim is nonsense "by design" because basically there is no
> limit as to what degree an assembler is to collect and remember "type"
> information.

Can it handle arbitrary types? Will it really allow abstractions and check/transform them or does it just substitute a few numbers for your identifiers?

There is a hole there because at some level a compiler also substitutes numbers (the base principle is that it can

Steve

Homepage E-mail

US,
14.04.2008, 05:51

@ marcov
 

Compiler debate

> The usual is "compile from higher to lower language".

Good description. But the original purpose was to translate from a more-human-readable (because more abstract) structure to a more-machine-readable, closer-to-the-metal structure, and finally to what used to be called machine language, now commonly called microcode.

> Wikipedia seems to define it that way, and I've seen these discussions get
> out of hand because the definition of "high" and "low" is discussed ad
> infinitum.

High=More abstract. Low=Closer to machine language. These are relative not absolute positions, i.e., meaningless in isolation.

> Stronger, the other way around ("low" to "high") is still a compiler,
> since it is more than a bit of rewriting, analysis is required.

Not a compiler, but the opposite: hammering lower-level code into a subset of a larger and more complex set of instructions.

> And a macro assembler that "compiles" to assembler also fits the
> definition (and the poor slobs that upload microcode patches to the CPU
> might argue that about assembler to microcode too).

The poor slobs are only old-fashioned. Time was when assembly was the HLL, and then Fortran was the miracle that allowed ordinary rocket scientists to get some real work done.

> And strictly one could even transcode a lower language to a higher language.

It's been going on since software was invented. In fact, it could be the definition of software, including languages, OSes...

> But is it then still an assembler at all? Maybe it then has become a
> compiler for a language with a bit of assembler here and there?

Assembler and compiler are zones in a continuum, not clearly marked strata.

> If you think about it that way, when does the macro handling become a
> language in its own right, and can the bits of asm in between seen as
> "inline assembler"?

> Such thoughts were initially the motivation for the above definition
> (without "high" and "low"). Apparantly the difference is when it stops
> being a macro language and becomes a language.

There's an old joke in linguistics: A language is a dialect with an army and a navy.

Japheth

Homepage

Germany (South),
14.04.2008, 08:49

@ marcov
 

FPC for DOS / FreeDOS

> > In short, your claim is nonsense "by design" because basically there is
> no
> > limit as to what degree an assembler is to collect and remember "type"
> > information.
>
> Can it handle arbitrary types? Will it really allow abstractions and
> check/transform them or does it just substitute a few numbers for your
> identifiers?

What is "it"? I can mention MASM, which I probably know better than anyone else. It allows you to define structures, bitfields and aliases of simple types. It lacks enums. Also, it allows you to define function pointers and offers parameter type checking for any kind of function call. It doesn't go as far as C/C++, and there's no check of a function's return value, but it's a good - and for most cases "sufficient" - level.

---
MS-DOS forever!

marcov

13.04.2008, 23:18

@ Japheth
 

FPC for DOS / FreeDOS

(post split due to length)

> > Up to C++, yes, though not automatically. C# doesn't belong in that
> > sequence IMHO. It is often put there for marketing, not technical
> reasons
>
> I like C# and think it is perfectly valid to put it into this line. JIT
> compiling is regarded as "progress" by the "majority", so why not?

Better or worse doesn't define a lineage. Crudely put: that somebody works in your old job and wears a similar hat to the one you used to wear there, because he thinks he should, doesn't make him your child. Not even if the whole office screams so.

So in short, language-wise there is simply no real family bond between C/C++ and C#, except for some minor syntax (like operator names, curly braces and the form of _some_ declarations). However, most of those actually work (are defined) differently than in C/C++. The resemblance (and the resulting advocacy) is the marketing reason I referred to. It was worthwhile for Java and C# to paint an upgrade path with these similarities, but that doesn't mean the languages are similar. Oh, and JITting and C# are not 100% correlated; see e.g. NGEN.

----

Anyway, these threads have gotten out of hand, since apparently people here (including you) don't want to discuss with an open mind, but grasp at any straw to defend their predefined position.

That's not productive, so this will be my last post not directly relating to the compilers that I know _something about_ (VP/FPC), or to technical DOS details. I like a bit of discussion, but these long posts cost a lot of work, and I don't have the feeling anybody is really reading them.

Steve

Homepage E-mail

US,
14.04.2008, 06:00

@ marcov
 

Compiler debate

> Better or worse doesn't define a lineage. Crudely said: because somebody
> works in your old job, and wears a similar hat that you used to wear
> there, because he thinks he should, doesn't make him your child. Not even
> if the whole office screams so.

Not the child, but for some purposes a functional equivalent. In the case of C#, perhaps with the gears better oiled.

> ... this will be my last post not directly relating
> to the compilers that I know _something about_ (VP/FPC), or technical dos
> details. I like a bit of discussing, but these long posts cost a lot of
> work, and I don't have the feeling anybody is really reading them.

I'm reading.

Japheth

Homepage

Germany (South),
14.04.2008, 08:21

@ marcov
 

FPC for DOS / FreeDOS

> Anyway these threads have gotten out of hand since apparantly people here
> (including you) don't want to discuss with an open mind, but cling to any
> hair to defend their predefined position.

What's your problem? After all, it was you who replied to Rugxulo's innocent mentioning of the FASM community with the "spoon in the desert" picture, thus suggesting that anyone using ASM nowadays is unreasonable (insane?). If someone dares to express some doubts about such a rigorous view, does that prove that he/she has no "open mind"? Possibly in your world, but not in general IMO.

---
MS-DOS forever!

marcov

14.04.2008, 11:12

@ Japheth
 

FPC for DOS / FreeDOS

> > Anyway these threads have gotten out of hand since apparantly people
> here
> > (including you) don't want to discuss with an open mind, but cling to
> any
> > hair to defend their predefined position.
>
> What's your problem? After all, it was you who replied to Rugxulo's
> innocent mentioning of the FASM community with the "spoon in the desert"
> picture, thus suggesting that anyone using ASM nowadays is unreasonable
> (insane?). If someone dares to express some doubts about such a rigorous
> view, does that prove that he/she has no "open mind"? Possibly in your
> world, but not in general IMO.

One can debate whether it is my rigidity or yours (plural), but the point remains that we are not getting closer together in these discussions, which makes them a bit too pointless.

Japheth

Homepage

Germany (South),
14.04.2008, 12:52

@ marcov
 

FPC for DOS / FreeDOS

> One can discuss if it is my rigidness, or yours (plural), but point
> remains that we are not getting closer together in these discussions,
> which makes them a bit too pointless.

What do you expect from discussions? They have the prefix "dis", not "con", so their purpose isn't at all to "get closer together". :-D

---
MS-DOS forever!

Steve

Homepage E-mail

US,
14.04.2008, 15:05

@ marcov
 

FPC for DOS / FreeDOS

> but point
> remains that we are not getting closer together in these discussions,
> which makes them a bit too pointless.

Maybe - but patience is a virtue. I say, let's keep throwing stuff in the pot, and see how the soup develops.

Rugxulo

Homepage

Usono,
12.04.2008, 20:34

@ marcov
 

FPC for DOS / FreeDOS

> However if you ask them, they have very sane reasons, and can back this up
> by actual facts. _ALL_ of them have the same application also in a HLL
> btw.

So the HLL didn't save them any development time? (Since they wrote it twice.) Or do they just use the compiler output?

> Correct. However what worries me more in these kinds of discussions is the
> failure to come up with any sane reasons. Everybody talks about "embedded",
> but if you ask further, it turns out they use yesteryears PC, which were
> never embedded in the first place.

Not everyone has the luxury of upgrading their computers every year / two years / five years!! So they make do with what they have. That's just reality.

> But it remain workarounds, and the processing program (the
> assembler/compiler) has way less information to work with (and to check
> things) in the assembler case then in the compiler case. And that already
> assumes you plugged the other defects.

You have to work around all the quirks in the C language and compiler; it's very, very hard to write things exactly the way it wants. Even the stupidest mistakes (forgetting a semicolon) can cause tons of superfluous error messages. Extremely annoying.
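
Classic example (try it; the exact message varies by compiler, but gcc says something like "two or more data types in declaration specifiers"):

/* one missing semicolon... */
struct point { int x, y; }      /* <-- should end with ';' */

int main(void)                  /* ...and the errors are reported here, */
{                               /* nowhere near the actual mistake      */
    return 0;
}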

> Moreover, C is known as the assembler of HLLs. So you comparing already to
> rock bottom.

Bare ANSI C may be considered low-level, but it hardly does anything. Very few programs can be written completely in ANSI C, and the CPU has many more features than C itself exposes. You're at the mercy of the OS and your libs every bit as much as the language itself.

> Depends. In general, people still doing asm are doing that for a very
> specifical reason, and they (or their boss) are typically not paid
> directly in proportion to their output, like e.g. a contract programmer.

If all you want is to get paid by number of lines or how fast you write, you won't care if it works optimally or not. Why study hard to get an A (assembly) when you can get a C? You only do it because you want to, because there's a perceived benefit, or you're a perfectionist.

> In those kinds of niches, people can do pretty much anything they want,
> not hampered by any competition (since not assembler, but their often long
> experience with the company is the main factor).

Assembly isn't niche, it's the norm. Just because some companies or programming groups focus on something else doesn't mean assembly has disappeared. It's still there, they just rely on someone else doing the heavy lifting.

ho1459

Homepage E-mail

Germany,
17.03.2008, 20:56

@ Japheth
 

HX-DOS Extender & Virtual Pascal 2.1.279

> Having to open up to 200 files concurrently sounds like a design flaw.
> Can't this be changed in the VP source?

Unfortunately the source to the compiler is not available.
The common FILES= configurations should work fine for normal projects
with a few source files; big projects may quickly exceed the usual file handle limits, though.

Thanks for HX,
Stefan / AH

Laaca

Homepage

Czech republic,
15.03.2008, 18:56

@ ho1459
 

HX-DOS Extender & Virtual Pascal 2.1.279

I don't use VP, but I tried it, and I didn't have any problems with the FILES variable. Everything worked fine with FILES=30. But ask the author of Necromancer's DOS Navigator - he is the greatest expert on Virtual Pascal.

---
DOS-u-akbar!

rr

Homepage E-mail

Berlin, Germany,
16.03.2008, 19:12

@ Laaca
 

HX-DOS Extender & Virtual Pascal 2.1.279

> I don't use VP, but I tried it, and I didn't have any problems with the
> FILES variable. Everything worked fine with FILES=30. But ask the author
> of Necromancer's DOS Navigator - he is the greatest expert on Virtual
> Pascal.

"ho1459" IS NDN's author! :-D

---
Forum admin
