echo: apple
to: comp.sys.apple2
from: apple2freak
date: 2009-03-04 21:47:04
subject: Re: A 21st Century Apple II?

On Mar 5, 9:11 am, "Michael J. Mahon" wrote:
> apple2fr...{at}gmail.com wrote:
> > On Mar 4, 6:18 pm, "Michael J. Mahon" wrote:
> >> apple2fr...{at}gmail.com wrote:
> >>> On Mar 4, 2:23 am, "Michael J. Mahon" wrote:
> >>>> apple2fr...{at}gmail.com wrote:
> >>>>> On Mar 3, 3:37 pm, "Michael J. Mahon" wrote:
> >>>>>> apple2fr...{at}gmail.com wrote:
> >>>>>>> On Mar 2, 9:26 am, mwillegal wrote:
> >>>>>>>> On Mar 1, 8:57 pm, adric22 wrote:
> > I can't imagine someone using the old tools even on a 4MHz Apple II
> > being as productive as someone using a more modern cross-development
> > environment on a PC.  A modern editor (well, emacs isn't exactly
> > modern, but...) combined with the near-instantaneous compilation of
> > even very large (for the Apple II) programs would be responsible for a
> > large part of this increased productivity I suspect.
>
> Productivity in a resource-constrained environment has almost nothing
> to do with the "efficiency" of the toolset.  If it takes me a week to
> input and debug my routine, that's another week of careful thought
> about the code and its behavior, and another 50 improvements in both
> substance and style.
>
> I love doing this, so I'm in no hurry for it to end!  The longer I take
> to do something, the better the result and the greater the joy.
>
I see your point.  If you spend the majority of your time working out
the details of your design, then who cares if it takes 20 minutes to
assemble/compile the code when you complete it.

> That's how different a labor of love is from a job!  ;-)
>
This is a key distinction.

[...]

> Speed is nice for some things, but when it causes programs to be
> developed by "tweaking" things and recompiling instead of sitting
> down with a pencil and figuring it out, it is a disservice to the
> programmer.
>
If you're involved in a labor of love, and time is of little
importance, then I mostly agree with you.  OTOH, if you are working to
a schedule, and need to produce some tangible results in less time
than it takes you to develop a full understanding of the system you
are working on, sometimes "tweaking" and recompiling is the only
choice available to you.

Tweaking is, I might add, also a perfectly valid way of determining
how something functions.  Contrary to Greek philosophy, it is not
necessary to examine the nature of atoms in order to determine the
nature of things built from atoms.  It is also perfectly valid (as the
Chinese discovered about 200 years before the Greeks) to determine the
nature of things based on how they interact with themselves and other
things.  Intelligent "tweaking" involves exactly this principle, and
is also central to the art (if I may call it that) of reverse
engineering.

> In the days of batch processing, I would pore over memory dumps
> until I understood every memory structure and table, often finding
> serious bugs or efficiency issues that had not yet been manifested
> in any other way.  The result was that each "run" led to the correction
> of dozens of problems, and the program improved dramatically.
>
I prefer your technique when I'm dealing with code I've written
myself, but the "tweaking" technique when I'm working with other
people's code.  Eventually, I develop a complete understanding of the
code in either case, although each approach has advantages and
disadvantages.

> And I had immense satisfaction in achieving a *complete* understanding
> of every aspect of the running code--even the ones that did not affect
> the "function" of the program.  When it was possible, I enjoyed
> *listening* to the execution of the program on a detuned AM radio,
> which gave me a good idea of the relative speeds of various parts of
> the program!
>
Heh.  I remember being able to hear the execution on the TV set
connected to the computer with the volume turned up sufficiently high.

> > I'd like to think that the modern software development process has
> > produced something of value.  I doubt OO techniques were used to
> > implement QuickDraw.  One hardly needs to use an OO language in order
> > to benefit from this paradigm of solving problems.  It could
> > potentially be less efficient than non-OO methods, but the loss
> > would be tiny compared to the big gain in productivity as reflected
> > through superior management of complexity.  I don't mean to single out
> > OO as the only thing that has been produced by the modern software
> > development process -- I'm just using it as one example.
>
> No doubt modern development processes have value--but a detailed and
> complete understanding of the dynamic behavior of the code is not one
> of them.  They, like more RAM and faster processors, are geared toward
> enabling things of greater complexity and bloat to be produced.  Some
> of these things are very useful, but few can be called beautiful.
>
> They represent the industrialization of a process which used to be a
> work of art and craft, with all the attendant costs and benefits.
>
> Surely, one of the primary attractions of working with old computers,
> and particularly the Apple II, is an appreciation of the joy of being
> on the bare, beautiful metal--just as a master woodworker would never
> use a power tool when a simple manual tool provides a more immediate
> experience of the texture of the wood.
>
As a systems engineer, I'd like to think that complexity (when
necessary) is not inherently ugly.  I'll grant you that because of
human nature, unnecessary complexity is all too often employed, and in
this case, it is rightfully called bloat.

Regarding the Apple II -- it is a beautiful computer -- but the design
was constrained not only by the principles of simplicity, elegance,
and efficiency, but also by cost and the technological limitations of
the time.  If you remove the latter two limitations (well, at least
the technological limitation anyway), and put yourself in Woz's shoes,
what new works of beauty might you come up with?

> "The joy is in the journey, not the arriving."
>
Absolutely.

[...]

> > Given the large die sizes used in the old days, I imagine it should be
> > possible (given access to suitable equipment) to slice open the chip
> > and examine the die under a fairly low power microscope.  This should
> > enable reconstruction of an equivalent circuit.  Hardly worth the
> > effort, though, except perhaps for the challenge of doing it.  That's
> > assuming that Apple wouldn't be willing to provide details on the
> > ASICs from their archives.
>
> Right--complete understanding is possible *in principle*, it's just not
> available *in fact*.  ;-)
>
Or the cost/benefit ratio makes a complete understanding unattractive?

[...]

> > Anyway, typically FPGAs are "loaded" at power up because the
> > implementations are volatile, so you could choose between several
> > different implementations to be loaded at power up time if you wanted.
>
> Right, I just have to produce them before loading them.  ;-)
>
Just like some of us like making a display driver out of resistors and
diodes, others like to create variations on old designs.  ;)

[...]

> > OTOH, someone with a "development system" could dynamically compile a
> > particular "instance" of hardware they would like to use, and then
> > load it into their system.  This would be analogous to having
> > reconfigurable peripheral cards in a real Apple II except that your
> > entire system would be reconfigurable.  If you got tired of the Apple
> > II one day and decided you wanted to try out a TRS-80, you'd just have
> > to create a new implementation and you'd have it.
>
> I really do appreciate the concept of reconfigurability--and I also
> appreciate that it is precisely *reconfigurability* that causes FPGAs
> to be more intrinsically complex than dedicated logic.
>
> That doesn't bother me when my objective is reconfigurability--only when
> it is the default method of implementing what is not reconfigured.
>
Hmm.  But isn't the ability to reconfigure the device a huge asset,
even when the intent of a project is not to create a reconfigurable
device?  It's awfully nice to be able to correct hardware bugs by
using reconfigurable software just like it's awfully convenient to be
able to fix software bugs by reloading a device's firmware.

> I have often found that relaxing design constraints results in a worse
> design, not a better one.  Sometimes flexibility is the enemy!
>
Using an FPGA in a design does not represent a relaxation of design
constraints, although it does represent an increase in flexibility.  I
suppose this flexibility may lead to sloppiness if a designer realizes
he can correct any problems that come up by simply recompiling his HDL
code, but that is human nature, not something inherent in the increased
flexibility -- which is also a significant asset if the system's
requirements may be subject to revision in the future.

> Don't get me wrong, I'm no Luddite (no offense, Simon ;-), but what I
> personally enjoy most about engineering is the solution of difficult
> problems with minimal physical resources and maximal human ingenuity.
>
I understand and appreciate your point of view.  In fact, I even share
it if you relax the definition of "minimal physical resources" to
include contemporary technologies, rather than 25-year-old
technologies.  :)

> I know this is unpopular these days, though there may come a time (think
> desert island ;-) when it will become more practical.  One thing that I
> am sure of is that I have enough versatility to appreciate both the
> convenience of a Big Mac and the joy of a cake made "from scratch"--
> each in its time and place.
>
Agreed.  I suspect we'd both give up the Big Mac before the cake "made
from scratch" though...

> Here, I've been putting forward the idea that the enjoyment of old
> computer systems results at least partly from the ability to get away
> from "up to date" industrial technology and revel in what can be done
> with simple technology employed cleverly and with great skill.
>
Again, I understand and appreciate this.

> To do otherwise is sometimes just what is needed, but is often a
> symptom of "gilding the lily"--and the Apple II is most certainly
> a wonderful lily.
>
> For example, no one using an FPGA-based implementation of the Disk ][
> Controller would appreciate the cleverness of relabeling the address
> and data pins on the state ROM to eliminate several board vias.  ;-)
>
Absolutely.  Such cleverness would be lost in an era where vias are a
thing of the past (at least on an FPGA anyway).

[...]

> >> Not only is it gone, it could not exist today.  Today's "kits" are
> >> essentially mechanical assembly, since the circuitry is both pre-
> >> printed and nano-sized.  That kind of assembly teaches electronics
> >> about as well as putting together an Ikea bookcase teaches furniture
> >> construction.  ;-)
>
> > But it does exist today -- just on a much smaller scale.  Check out
> > www.ramseyelectronics.com for an example.
>
> I'm familiar with Ramsey's kits--and they are perhaps the closest
> surviving relative of the Heathkit.  Ironically, many of their kits
> use the same types of ICs and circuit boards as the Apple II--so I
> see them as confirmation of my principle.
>
Ramsey's kits are much simpler than most of what Heathkit offered.  I
don't see any color TVs in the Ramsey catalog.  Also, Ramsey doesn't
even come close to the quality present in all of the Heathkits I ever
built.  Still, a number of their projects make liberal use of surface
mount components which are smaller than a grain of rice.

> If you read electronics hobby magazines today (there are still a
> couple), you will note that an increasing fraction of the projects
> are based on programming a microcontroller--often to make it function
> like a 555 timer and a couple of gates!  I love microcontrollers, but
> I also love 555s and gates, and would hate to see them languish.
>
Which one is more applicable depends a lot on your design
constraints.  With microcontrollers being very cheap now (not much
more than a 555), they are increasingly being used in applications
where less complex devices could be used.  But let's not forget that
designers appreciate their reconfigurability should the design
constraints change.

> Several times in my life, I've found people writing complex (and often
> incorrect) code to compute a moving average that could have been
> computed in the analog domain with a single resistor and a capacitor!
>
Unless the circuit in question had a very low output impedance, I
think you might have to add another resistor and an op amp to your
circuit above.  ;)
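For readers who do want to stay in the digital domain, the RC circuit
has a one-line software equivalent: a single-pole IIR filter, better
known as an exponential moving average.  A minimal sketch (illustrative
only, not anyone's production code -- the `ema` name and the alpha
parameterization are my own):

```python
def ema(samples, alpha):
    """Exponential moving average: the digital analogue of a
    single-pole RC low-pass filter.  For sample interval dt,
    alpha corresponds to dt / (R*C + dt)."""
    y = samples[0]             # start at the first sample
    out = []
    for x in samples:
        y += alpha * (x - y)   # output decays toward the input, RC-style
        out.append(y)
    return out
```

Feed it a step input and the output converges exponentially toward the
new level, exactly as the capacitor voltage would -- no complex (and
often incorrect) windowed-average bookkeeping required.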

> >> Of course, there are the "100 experiments" packaged
products, but
> >> they are all *very* introductory.
>
> > Yes.  Good maybe for children in middle school to play with.
>
> Or grade school.  By the time I was a freshman, I had built an
> oscilloscope and was working on sweep circuits and video amplifiers
> for photomultipliers.  All of my parts were salvaged from trashed
> radios and TVs and a few military surplus purchases.
>
Ahem.  How many of your peers had accomplished similar things at that
point in your life?

> >> The good news is that electronics can still be done at the SSI/MSI
> >> level, where functions and connectivity are visible and hackable
> >> with inexpensive tools.
>
> > I see some nostalgia value in doing this, but I don't really think
> > that a whole lot of practical knowledge would be gained that would
> > have applicability in the world today.
>
> Well, it would be a good foundation for someone who would then learn
> about FPGAs.  ;-)  Or, perhaps, one could skip all that, the way that
> engineers today skip vacuum tubes...  (Of course, that leaves them
> vulnerable to "mystical" ideas about tubes and their problems.  ;-)
>
Vacuum tubes are essentially obsolete, so there isn't much point to
learning about them for most engineers.  Someday, the same will be
said about the 7400 and 4000 logic families, so it seems to me to be
more worthwhile to teach the abstract (i.e., theoretical) concepts
first, and then these may be applied to whatever technology the
engineer happens to be working with at the time.

[...]

> Breadboarding with SSI is also pretty easy, for relatively simple
> functions--not implementing an Apple II, but implementing a peripheral
> card.  And the difficulty of changing the physical implementation
> actually motivates a designer to think longer and more carefully about
> each gate and wire, and the overall design of the card.
>
I'll grant you that there are plenty of challenges involved in working
with 25-year-old technology and that it can even be fun.  And perhaps
the discipline instilled by doing things the old ways can make a
contemporary engineer better at what he does too.

> > Yes, but it's worth it for some of us who would like the ability to
> > experiment with the innards of the system.  It also doesn't hurt to
> > learn a skill which is of practical value today.
>
> Absolutely.  Using a marketable skill set to play with Apple II's is a
> very understandable thing.  It's just not my thing.  ;-)
>
I understand that you enjoy working with the older technology.

We both appreciate simplicity, elegance, and efficiency.

I prefer to work with newer technology -- to me it represents greater
unrealized potential.

[...]

> The same goes for global logic optimization.  Though the search space
> is usually much smaller, it's still an NP-complete problem.
>
We'll never know unless someone tries it.  I suspect a modern computer
could solve an NP-complete problem on the scale of global logic
optimization for an Apple II in a couple of seconds.
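To make the search-space point concrete, here is a hypothetical
brute-force sketch (mine, not anything from an actual logic optimizer)
that finds a minimum feed-forward 2-input NAND network for a given
truth table by exhaustive enumeration.  The candidate count grows
roughly as the product of (inputs + gates)^2 per gate, which is exactly
why real tools fall back on heuristics at scale:

```python
from itertools import product

def min_nand_circuit(target, n_inputs, max_gates=5):
    """Exhaustively search feed-forward 2-input NAND networks for the
    smallest one whose final gate realizes `target`, a truth-table
    tuple indexed by input row.  Illustrative sketch only."""
    rows = range(2 ** n_inputs)
    # Column of values each primary input takes across all rows.
    signals = [tuple((r >> i) & 1 for r in rows) for i in range(n_inputs)]
    for n_gates in range(1, max_gates + 1):
        # Gate g may read any earlier signal (an input or a prior gate).
        per_gate = [list(product(range(n_inputs + g), repeat=2))
                    for g in range(n_gates)]
        for wiring in product(*per_gate):
            cols = list(signals)
            for a, b in wiring:
                cols.append(tuple(1 - (x & y)
                                  for x, y in zip(cols[a], cols[b])))
            if cols[-1] == target:       # last gate is the output
                return n_gates, wiring
    return None
```

Asking it for XOR of two inputs (truth table 0,1,1,0 over rows 00, 10,
01, 11) reproduces the classic result that four NANDs are required --
and already enumerates on the order of 10^4 wirings to prove it.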

> > What is important to me is being able to understand what the system is
> > doing at an architectural level.  I have a lot less interest in
> > knowing what the electrons themselves are doing.  I think this is the
> > fundamental difference in our philosophy.
>
> Or in what gives us the most satisfaction (after all, I'm a physicist).
> ;-)
>
Now that explains a lot...  ;)

> >> It's a simple but regrettable fact that the resources required for an
> >> implementation expand to fill the resources that are available, whether
> >> it's bytes of memory, or transistors on a chip, or LUTs of an FPGA.
>
> > I say we should blame the marketers.  They are always pushing the
> > engineers to produce more "functionality" in their company's products
> > in order to differentiate themselves from their competitors.  How many
> > of us use even 10% of the features that are built into MS Word today?
> > Probably 90% of the useful functionality of a modern word processor is
> > contained within Appleworks.  They also constantly push the engineers
> > to produce more functionality in less time, which leads to the huge
> > number of layers upon layers of libraries upon which modern software
> > systems are built but which are primarily responsible for the bloat
> > which we both so despise.
>
> I think that's an excellent description of the situation.
>
> I also think it's degenerate, and to be regretted.
>
> We always have ignorant customers, but we don't always have to
> pander to them.  After all, we are supposed to know what's worth
> doing and what isn't.
>
> A "market-driven" company is a company without vision.  A company
> with vision drives the market, not the other way around.
>
I completely agree with you.  Know of any companies with vision?
Better still, companies with vision that are hiring?  ;)

> > Maybe it's time to start cross-posting to rec.philosophy?  ;)
>
> Everything is eventually philosophy.  ;-)
>
> I again stress that my comments here are expressing my own opinions,
> and are not "truth".  ;-)  The tradeoffs we are discussing are central
> to modern technology, and are brought into sharp contrast in the
> context of the old technology we love.  It is always relevant to
> consider what we value and how it is affected by technology--something
> we as a society have done far too little.
>
Again, I completely agree.  Society has generally accepted and
endorsed technology with little regard for how it affects our values.
If you argue that this has not always been to our betterment as human
beings, I would agree with you again.

[...]

> > Nothing wrong with wanting to know what the electrons are up to.
> > However, I'm more concerned with an architectural understanding, and I
> > find if I focus on what the electrons are doing, my ability to
> > understand complex architectures is more compromised than if I don't.
> > I think the difference is simply in our choice of allocation of mental
> > resources.
>
> I understand completely.  And I recommend "stretching exercises" to
> make it easier to think across more and more levels of abstraction.
> That capability, more than any other, is the key to mastering system
> design.  Dividing a design up at the outset to limit communication
> across levels is a sure way to get a suboptimal design--and often
> *far* suboptimal.
>
What you say makes perfect sense when a project is simple enough to be
handled by a single designer.

Otherwise, the design should be layered, with well-defined and limited
interfaces at each layer, so that each layer may be handled by a
separate designer or team of designers with limited knowledge of what
goes on in the other layers.  For any project of decent size, an
approach like this, which limits the exposure of the complexity within
each layer of the implementation, is necessary for the project to be
completed by human beings in finite time.

Possibly a system such as the Apple II series represents a level of
complexity that is not far from the upper bound of what a talented
single designer is capable of.

> I watched Intel do that once with a microprocessor design, and they
> wound up using 5x as many people for twice as long to get half the
> performance in the same silicon process!  Design methodology matters.
>
I agree with you that design methodology matters.  I'm not sure,
however, whether we agree that for projects of any decent size, a
design methodology that limits the exposure of the complexity of the
individual pieces is necessary for the project to be realizable in
finite time.

> I've often said that if floors were transparent, there would be no
> skyscrapers!  But that applies to "users", not to the designers and
> builders of skyscrapers, who must always think in three dimensions
> to get correct answers.
>
For a skyscraper where each floor is a clone of the one above or below
it, this makes sense.  However, for a skyscraper in which each floor
is completely different from the one above or below it, I don't agree.

By way of example, let's take the human body.  If you lived to an
average age of 80, and began studying it at age 18, you wouldn't be
able to get through more than a small fraction of the total knowledge
that is available.  And by the time you were 80, you would have
forgotten most of what you learned when you were younger anyway, plus
a lot of the information you learned would become obsolete.

Yet the human body represents a triumph of design, at least in the
ways we've been talking about -- well, elegance and efficiency
anyway.  One may argue that it is as simple as it can be and still do
what it does as well as it does.

Someday, human beings may attempt to design something similarly
complex.  Yet none of the designers could possibly understand more
than a tiny fraction of the whole.  If I take your argument to its
logical conclusion, I'd say this means that human beings are limited
to designing efficient systems of limited complexity -- many orders of
magnitude smaller than that which the human body represents.  But I
think that with the right tools and design methodologies it may
someday be possible to do exactly this.

> We have to be able to think about all the levels of abstraction as
> close to simultaneously as we can.  That's how system-appropriate
> tradeoffs are made and cross-level system designs are optimized.
>
As a systems engineer, I appreciate what you are saying here.
However, I do not need to understand each layer of a system to nearly
the same level of detail as those who are responsible for the
implementation of the individual layers.

> The most important decisions we make are which problems to solve and
> which not to solve.  "Never put off 'til tomorrow what you can put
> off forever." =A0;-)
>
Wise words.
[...]

> > Don't they make any flatscreen monitors with video inputs any more?
>
> Yes, but emulation hosts use VGA display modes, and emulating the
> various artifacts of an analog NTSC color monitor is harder than
> it looks--and it's necessary to appreciate many Apple II graphics.  ;-)
>
Hmm, I forgot about that little detail.  It would be rather tricky to
emulate.

> And it seems that VGA displays accepting NTSC input are becoming
> less common, as the digital TV transition continues.
>
True -- I'd better buy one while they are still available!

> > I understand.  Woz did indeed design such a system -- for the 1980s.
> > Where is the equivalent system for the 1990s or the 2000s?
>
> That is an excellent question.  I suppose we could stipulate that
> as the field moves forward, so does the "entry point", and gate-level
> design is now as dated as carburetor adjustment.  But to me it seems
> that something is lost when logic is just "hardware programming".
>
Something is lost, and something is gained.  Just like we lost our
tails when we came down out of the trees (if you subscribe to Darwin),
we gained in other areas (opposable thumbs).

> > Can we extrapolate Woz's platform from the 1980s to obtain an
> > equivalent platform for the 1990s or the 2000s?  What about the
> > critical mass required to induce people to create useful software for
> > such a platform?
>
> It's more likely that they would simply download existing high-volume
> cores, so that their huge software bases would suffice.
>
I'm sure that's what most people would do.  To them the software
represents a set of tools used to get a job done, and the hardware is
simply the workbench that allows them to use their tools.  Eminently
practical, and boring.

Is there another angle from which to view computing?  One that would
empower a vision of what computing could be if "the spirit of Woz" were
alive today?  Or is there simply no point to this line of thought and
we should just enjoy the old systems for what they are alongside the
new systems for what they are and never the twain shall mix?

[...]

> >> Of course.  I have very good control over my attitude!
>
> > Tell me, what is your secret?  Zen meditation?  Or is it those
> > marvelous microbrews you have in your neck of the woods?  ;)
>
> A lifetime of learning to deal with upsets and setbacks, and the
> experience of making them "go away" by looking directly at what
> I'm experiencing, which invariably shifts my point of view.
>
Ahh, so you're a Zen Buddhist then...  ;)

> I don't always succeed fast enough to avoid some suffering, but that,
> too, serves the valuable purpose of reinforcement--"It still hurts
> when I do that!".  ;-)
>
Yes, pain reminds us that we are still alive.

> I think we've reached some sort of closure!  ;-)
>
It has been a pleasure!

--
Apple2Freak
--- SBBSecho 2.12-Win32
* Origin: Derby City Gateway (1:2320/0)

SOURCE: echomail via fidonet.ozzmosis.com
