echo: apple
to: comp.sys.apple2
from: apple2freak
date: 2009-03-04 07:51:32
subject: Re: A 21st Century Apple II?

On Mar 4, 6:18 pm, "Michael J. Mahon" wrote:
> apple2fr...{at}gmail.com wrote:
> > On Mar 4, 2:23 am, "Michael J. Mahon" wrote:
> >> apple2fr...{at}gmail.com wrote:
> >>> On Mar 3, 3:37 pm, "Michael J. Mahon" wrote:
> >>>> apple2fr...{at}gmail.com wrote:
> >>>>> On Mar 2, 9:26 am, mwillegal wrote:
> >>>>>> On Mar 1, 8:57 pm, adric22 wrote:
>
> > [...]
>

[...]

> > I know this may be considered sacrilege by some, but might there not
> > be some advantages to using modern systems to cross-develop for the
> > old hardware?  For example, if you want to use C or 6502 assembly, you
> > can use cc65 or ca65 on a modern machine and then enjoy all of the
> > advantages of a development environment like emacs.  From what I
> > remember when I did a bit of Apple II programming around 20 years ago,
> > using tools like Merlin or the SC assembler was much more tedious, and
> > Aztec C was nice for its time, but again much more tedious than cc65.
>
> Tedious for some, but potentially much more efficient--a critical
> requirement for a 1 or 2.8MHz system.  Of course, with built-in
> acceleration, that requirement is relieved--now "bloatware" would
> be adequate to the task.  ;-)
>
I can't imagine someone using the old tools, even on a 4MHz Apple II,
being as productive as someone using a more modern cross-development
environment on a PC.  I suspect a modern editor (well, emacs isn't
exactly modern, but...) combined with the near-instantaneous
compilation of even very large (for the Apple II) programs accounts
for a large part of this increased productivity.

> Seriously, the details of implementing QuickDraw are far from trivial.
> (For a long time on the Mac it was all 68000 assembly language.)
>
> > [...]
>

I'd like to think that the modern software development process has
produced something of value.  I doubt OO techniques were used to
implement QuickDraw.  One hardly needs to use an OO language in order
to benefit from this paradigm of solving problems.  It could
potentially be less efficient than non-OO methods, but the loss
would be tiny compared to the big gain in productivity as reflected
in superior management of complexity.  I don't mean to single out
OO as the only thing the modern software development process has
produced -- I'm just using it as one example.

[...]

> >> True, provided that it is a gate-for-gate implementation, but I doubt
> >> that would be the case.  The minimal level of integration of an Apple II
> >> contributes to being able to tap or probe almost every significant
> >> signal, and though this capability decreases as you move from the ][+ to
> >> the IIgs, I still find it quite useful.
>
> > No need for a gate-for-gate implementation.  Consider, for example,
> > the changes between the original Apple II (100% discrete logic) and
> > the Apple IIe which used some ASICs in the design.  The latter was not
> > necessarily a gate-for-gate implementation of the former.  Yet both
> > are easily understandable.  No significant changes to the hardware
> > were made in the interests of software compatibility.  There's no
> > reason that similar changes could not be made with an FPGA version.
>
> Actually, the logic implemented in the IOU and MegaII is not fully
> understood--only the functional behavior.  The implementation of
> the Apple II logic functions is quite different from the TTL version.
>
I'm assuming here that while it may not be possible to replicate the
exact internal logic of the ASICs, it is possible to produce
functionally equivalent implementations through standard reverse-
engineering techniques.

Given the large die sizes used in the old days, I imagine it should be
possible (given access to suitable equipment) to slice open the chip
and examine the die under a fairly low-power microscope.  This should
enable reconstruction of an equivalent circuit.  Hardly worth the
effort, though, except perhaps for the challenge of doing it.  That's
assuming that Apple wouldn't be willing to provide details on the
ASICs from their archives.

> > Also, with an FPGA, it is my understanding that you can tap or probe
> > any signal you like by designing the system in such a way as to bring
> > these signals to the FPGA I/O lines.  Even better, you can simulate
> > the FPGA cycle by cycle on a simulator and analyze the performance of
> > the system at a level of detail that would make any tech using a
> > logic analyzer extremely jealous.
>
> So to "tap" a logic level in the system, you must re-compile the FPGA.
>
No -- you simply decide what signals you want to be able to access,
and build them into your original design.

Anyway, typically FPGAs are "loaded" at power up because the
implementations are volatile, so you could choose between several
different implementations to be loaded at power up time if you wanted.

> Not long ago, I implemented interlaced Apple II video with only a diode,
> a resistor, and a couple dozen lines of assembly code.  And any Apple II
> in the world could duplicate my experiment with the same simple hookup.
>
You're just the sort of person I'd like to have around if I were to
crash-land on a deserted island somewhere.  We'd build a crystal radio
and spark gap transmitter in order to direct the search & rescue ships
to our location!  ;)

> Requiring some fluency in an FPGA development environment is a very
> large barrier to entry in comparison (and it runs as bloatware on a
> much more complex system).
>
> I understand that all these things are easy for one who already has
> the toolchain and the experience--but both are much more complex than
> an Apple II and its Reference Manual.
>
Yes if you want to do FPGA development.

OTOH, someone with a "development system" could dynamically compile a
particular "instance" of hardware they would like to use, and then
load it into their system.  This would be analogous to having
reconfigurable peripheral cards in a real Apple II, except that your
entire system would be reconfigurable.  If you got tired of the Apple
II one day and decided you wanted to try out a TRS-80, you'd just have
to create a new implementation and you'd have it.

> > Sure, and add to that levels of complexity that are orders of
> > magnitude higher than what existed at the time the Apple II computers
> > were designed and the need to have a microscope and extremely steady
> > hand to solder modern devices.  It should not be surprising that the
> > number of hobbyists who experiment with modern electronics has dropped
> > considerably, at least as a percentage (Heathkit is gone, for
> > example).
>
> Not only is it gone, it could not exist today.  Today's "kits" are
> essentially mechanical assembly, since the circuitry is both pre-
> printed and nano-sized.  That kind of assembly teaches electronics
> about as well as putting together an Ikea bookcase teaches furniture
> construction.  ;-)
>
But it does exist today -- just on a much smaller scale.  Check out
www.ramseyelectronics.com for an example.

> Of course, there are the "100 experiments" packaged products, but
> they are all *very* introductory.
>
Yes.  Maybe good for middle-school children to play with.

> The good news is that electronics can still be done at the SSI/MSI
> level, where functions and connectivity are visible and hackable
> with inexpensive tools.
>
I see some nostalgia value in doing this, but I don't really think
that a whole lot of practical knowledge would be gained that would
have applicability in the world today.

> The bad news is that modern devices are not built that way--to get
> that kind of electronics, you have to go "retro".
>
It seems to me that the point of FPGAs is to give the modern
electronics experimenter a virtual breadboard where he can design and
test his circuits using bloatware and then once they are performing as
designed, load them onto this "breadboard" and put them to immediate
practical use.  The alternative of creating PC boards, debugging them,
and creating new revisions is much slower and more painful in
comparison.

This makes them perfect for projects which won't be produced in
quantity and/or which will be subject to constant revision.

[...]

> I'm aware of FPGA toolchains, but what would be needed to spur
> experimentation would be at least a library of Apple II-related
> blocks:  address decoders, ROM expansion space logic, etc., that
> would raise the level of design somewhat.  Modifying an existing
> "prototype" design would be a good start, but playing with FPGAs
> is not as easy as playing with jumpers and TTL gates--unless one
> is already familiar with the tools.
>
I think there's a lot more potential for experimentation and revision
using the virtual breadboard of an FPGA as opposed to an old PC board
full of SSI/MSI chips.  Probably that's because I like the idea of
changing what's inside the box, as opposed to playing with what's
outside the box.

[...]

> > The hardware developed by Alex & Steve is capable of accepting Apple
> > II peripheral cards.  So there really isn't much difference between
> > using their system and a real Apple II except perhaps for that feeling
> > of nostalgia you get by sitting in front of the original software.  In
> > some ways, the experience of their system may be superior in that if
> > you can't locate a particular card you'd like to work with, you can
> > always go to the trouble of implementing it virtually on the FPGA
> > (assuming you can find the requisite documentation).
>
> Yes, but they support plug-in peripheral cards--an important part
> of making the system authentic from my point of view.  (Of course,
> that's pretty expensive compared to a real Apple II.)
>
Yes, but it's worth it for some of us who would like the ability to
experiment with the innards of the system.  It also doesn't hurt to
learn a skill which is of practical value today.

> > Also, unless you've gone to the trouble of replacing your power supply
> > and every electrolytic capacitor in all of your hardware with modern
> > replacements, plus replaced all the tin-plated sockets with gold-
> > plated ones and keep all the contacts on the board clean, you are
> > likely to run into reliability issues with the old hardware.
> > Actually, I'm sure you know more about this than me as I have not
> > owned or used an Apple II for about 20 years now.
>
> As one who has used an Apple II almost continuously over the last
> 28 years, I can say that I have experienced practically no problems
> related to the reliability of the logic or the construction of the
> Apple II systems I use.
>
> I have had one power supply failure (easily fixed), one crystal
> failure (random), and one mechanical failure of a bypass cap (I bent
> it too often while inserting and removing a Zip Chip ;-).
>
> I'd guess that I have cleaned the contacts on about a half-dozen
> peripheral cards in 28 years to fix an unreliable contact--hardly
> a major issue.
>

I'm happy to hear this.  I hope my experience with the IIe I just
picked up is similarly pleasant.

> The rumors of the demise of 1980s-vintage TTL machines and sockets
> are grossly exaggerated and unreasonably feared.
>
> Quite to the contrary, I've dropped things (metal things) into my
> machines (which always run with the tops off), tapped into them for
> power, and attached dozens of probes, all without undue concern for
> ESD or other issues.  I've caused ICs to heat until they were much
> too hot to touch (I'd guess around 80 degrees C) and all of this
> without inducing any failures!
>
> Apple II's are amazingly robust in comparison with modern chips!
>
Really?  Modern CPUs run at 80C all day long.  According to the built-
in temperature sensor on my old PowerMac G5, the processors would
regularly run at around 80C!  Now, they did only last about 4 years
before one of them burned out, but that was probably because of the
mechanical cooling system.

I will grant you that most PCs built today will fail within 10 years
or less.  They aren't built to last -- they're built to be cheap.

> >> That will be difficult if you have to reverse-engineer each one.
>
> > Yes -- that is exactly the point!  Reverse-engineering and re-
> > implementing them is actually fun for some strange people.  ;)
>
> I'm probably one of them--but I'd hate to reverse-engineer every card
> just to try it out.
>
Perhaps that wouldn't be necessary if others had already done the
work.

> >> History suggests that only a few of the most popular cards will ever
> >> be emulated, and there were hundreds of them--some quite interesting,
> >> like 68008 coprocessors.
>
> > Sure, but there is no reason why they could not be emulated if someone
> > wanted to go to the trouble.
>
> Right, but some of these would take a lot of FPGA!
>
I suspect you could load at least 10 68000 cores onto one of the
larger FPGAs made today.

> And the point is that if I have one to plug in, it's no trouble at all.
>
Sure, but what fun would that be?  :)

> > In the old days it was the telephone that interrupted my
> > concentration.  Now it is instant messaging and email.  No getting
> > around it -- it's required and expected in my line of work, but no
> > less annoying than the phone was 20 years ago.  Of course, the phones
> > were more reliable, but I guess that's evolution for you...  :)
>
> In the early 1990s, as cellphones were beginning to proliferate, I
> noted that we would certainly need a dynamic priority system for
> disabling/enabling the ringer based on the caller's identity, the
> caller's stated priority, and the callee's selectable interrupt
> level.  It amazes me that we continue either to give everyone in the
> world NMI priority or we disable all interrupts--much too primitive!
>
Agreed, though the ability to select different ring tones for
different callers gets us almost all the way there.  If the phone also
allows us to select from several different "configurations",
then we can use "silent" ringtones for some callers based on which
configuration is active.

[...]

> > Also, whilst some people may consider the unused gates on an FPGA to
> > represent bloat, I would only consider the system bloated if
> > unnecessary gates were used.  What makes you think that an FPGA
> > implementation would use more gates than were used in the original?
> > Woz was no doubt very good, but he did not have at his disposal at the
> > time he created the design the tools which are available to engineers
> > today.  These tools can automatically simplify the logic to make
> > maximally efficient use of the cells in an FPGA.
>
> Now there's a challenge!  I challenge anyone to create a functional
> Apple II design using fewer gates than Woz did!  It may be possible,
> but it certainly isn't probable, since he was a master at simplification
> and parts reduction.
>
Just like some modern compilers often produce more efficient code than
assembly language programmers can produce, I would expect that someone
armed with modern CAD software could improve on Woz's design.  Hard to
prove though unless someone steps up to the challenge.

> And my point was that a "minimal" FPGA implementation is much more
> profligate with transistors than the TTL implementation, just as an
> FPGA implementation of the 6502 will use *many* more transistors than
> a real 6502.
>
OK, but you are picking and choosing your standard when you say
"transistors."  I could pick any number of other yardsticks.  Watts,
total die area, gates, etc.  In each of these cases the FPGA system
would beat out the original system or at least equal it (in terms of
gate count).

> >> The fact that Moore's "Law" has made the prodigious waste of hardware
> >> economical does not alter the fact that it uses hugely more transistors
> >> to accomplish *anything* than an implementation based on simple gates.
>
> > Certainly the cells in an FPGA use more transistors than the
> > equivalent gates in the SSI logic used in the Apple II.  However, the
> > conceptual complexity of the design need not and perhaps should not be
> > any greater than the original.
>
> At an architectural level, that is of course true for any functionally
> equivalent system.  But we are talking about the implementation level.
>
What is important to me is being able to understand what the system is
doing at an architectural level.  I have a lot less interest in
knowing what the electrons themselves are doing.  I think this is the
fundamental difference in our philosophy.

> >> I suggest that if you can ignore the mind-boggling waste of an FPGA
> >> implementation, ignoring the hundreds of DLLs required just to display
> >> an "x" on the screen of a modern computer is a similar perceptual feat.
>
> > My Mac running Leopard is using 1.57 gigabytes of memory at present
> > (with 5 active applications running).  I suspect if we were to compare
> > the software bloat of this computer against the hardware bloat of an
> > FPGA implementation of an Apple IIe, we'd find that the software was
> > at least a decimal order of magnitude more bloated, if not a lot more.
>
> Software bloat is proportional to memory density, while hardware bloat
> is proportional to logic density.  Memory density always wins by a large
> margin.
>
> It's a simple but regrettable fact that the resources required for an
> implementation expand to fill the resources that are available, whether
> it's bytes of memory, or transistors on a chip, or LUTs of an FPGA.
>
I say we should blame the marketers.  They are always pushing the
engineers to produce more "functionality" in their company's products
in order to differentiate themselves from their competitors.  How many
of us use even 10% of the features that are built into MS Word today?
Probably 90% of the useful functionality of a modern word processor is
contained within Appleworks.  The marketers also constantly push the
engineers to produce more functionality in less time, which leads to
the layers upon layers of libraries on which modern software systems
are built -- and those layers are primarily responsible for the bloat
we both so despise.

> That pretty much describes the strength of human discipline.  Our
> rationalization is, "It's already paid for, I may as well use it."
> Or, sometimes, "If I don't use it, there will be no reason to upgrade
> to a bigger system."
>
Maybe it's time to start cross-posting to rec.philosophy?  ;)

[...]

> I wasn't actually talking about the unused LUTs as waste--just the
> intrinsic complexity of substituting LUTs for gates.  And I agree that
> this is not conceptual complexity, any more than "display character"
> is complex, even though it invokes half a million instructions in two
> dozen DLLs.  See my point?
>
> I feel quite comfortable with what the electrons are up to when I
> type a line into my Apple //e, but I have no real understanding of
> what they are doing when I type a line into my "compose" window.
>
> That makes no difference to one who doesn't think about electrons,
> but I do!  I realize that this is a mental model issue--but that is
> exactly what is lost when twenty layers of abstraction cover up the
> underlying reality, and that is what leads to incomplete understanding.
>
Nothing wrong with wanting to know what the electrons are up to.
However, I'm more concerned with an architectural understanding, and I
find if I focus on what the electrons are doing, my ability to
understand complex architectures is more compromised than if I don't.
I think the difference is simply in our choice of allocation of mental
resources.

[...]

> > Sure, some cards would be much more difficult to emulate than others
> > -- I accept that.
>
> Good.  I don't.  I want to plug in the card, and maybe even fix it.
> For me, the original hardware is an essential part of the experience,
> but I can appreciate that that is not so for others.
>
Exactly!

I just want to play with the innards of the hardware -- and find it
much easier to do so with a reconfigurable system.

[...]

> > I completely understand.  I just bought an Apple IIe myself because I
> > want to be able to enjoy the nostalgia myself.  Who knows, with a VGA
> > adapter, CFFA, and accelerator, it probably represents a shorter path
> > to what I envision as an ideal retro-environment to work in than
> > building something with an FPGA as well.  Oh, but did I mention I like
> > to play with hardware?
>
> Then you can appreciate that you don't really need a VGA display for
> 80 columns of text!
>
No, but I also don't want to clutter up my workspace with useless
never-the-same-color monitors when I have plenty of unused flatscreen
displays lying around...

> So, as you can guess, I have only a slight interest, more curiosity
> actually, regarding VGA video from an Apple II.  (I'm somewhat more
> interested in faithful emulation of an NTSC monitor on an emulator's
> VGA display.)
>
Don't they make any flatscreen monitors with video inputs any more?

> >> In my book, the spirit of Woz was always doing something astonishing
> >> with practically nothing but cleverness.  ;-)
>
> > Yes, I really miss the "spirit of Woz" in the IT world today, but I
> > suspect this sentiment is not shared by the vast majority of software
> > or hardware engineers today.
>
> Clearly, it is not.  I regard that as their loss.  By surrendering the
> artistry and craft of programming, they deprive themselves of much joy
> and self-expression.
>
Hear hear!

> > It would be neat if someone would design a system that catered to
> > people with the spirit of Woz, but I'm sure there's no money in it.
> > Fortunately, with reconfigurable hardware (FPGAs), we have the next
> > best thing.
>
> Woz designed such a system.  ;-)
>
I understand.  Woz did indeed design such a system -- for the 1980s.
Where is the equivalent system for the 1990s or the 2000s?

Can we extrapolate Woz's platform from the 1980s to obtain an
equivalent platform for the 1990s or the 2000s?  What about the
critical mass required to induce people to create useful software for
such a platform?

I hope that by tinkering around with reconfigurable hardware based on
Woz's design from the 1970s/1980s I'll gain some greater insight into
this, not to mention have some fun and learn a few useful tricks along
the way.

[...]

> >> If an "Apple II peripheral card" toolkit and user interface could
> >> be made so that creating a new card was as easy as putting together
> >> Tinkertoys, that would be outstanding!
>
> > I gather that Alex & Steve already have a board which is pretty close
> > to this now.  I believe it uses an FPGA.  The question is, will you be
> > able to get past the "mind-boggling waste of an FPGA implementation"
> > and use one in one of your Apple II systems?  :)
>
> Of course.  I have very good control over my attitude!
>
Tell me, what is your secret?  Zen meditation?  Or is it those
marvelous microbrews you have in your neck of the woods?  ;)

> The "new substance" that the FPGA implementation provides is dynamic
> reconfigurability, which requires and justifies lots more transistors.
>
Hear hear!

--
Apple2Freak