echo: public_domain
to: Bob Lawrence
from: Rod Speed
date: 1995-01-30 13:01:28
subject: sot/eot 2/3

(Continued from previous message)

BL> you stuff around until you find code that *wants* to work because
BL> it can't do anything else.

RS> Really good code is often the result of an inspiration on an
RS> approach which really is intrinsically bullet proof, particularly
RS> WRT the alg. Some approaches are bullet proof, others arent.

BL> This is what I said, using different words.

Nope. With an analog design thats sort of the same thing as sensitivity
analysis: a stable design is one which isnt very sensitive to the
inevitable variation in either components or working conditions.

Software is quite different in the sense that what usually breaks a not
very good design is some uncommon but real eventuality that it just goes
bang on. Analog designs tend not to be very sensitive to those, say a
bit of a spike or something, they tend to ride thru it. An alg with a
weak spot goes bang every single time the thing it cant handle turns up.

Quite different in detail.

BL> In my experience, a good circuit is one that *wants* to
BL> work, and code is similar because it is the same principle.

Nope, there are some similarities, but IMO you are overstating
it considerably with software, it really is different.

BL> Events that are conditional on other events almost always
BL> stuff up. You discover this by testing - doing weird things.

Yes, but there is more to it than that with software. Quite a lot
is philosophically different, like choosing a format which allows
checking in the first place. It doesnt just happen by trial and
error to be robust, that robustness has to be DESIGNED in.

For example compare the use of parity and CRC for checking a file,
say one being transferred over a noisy comms link like a modem.
Parity has a significant weakness: it cant catch a pair of bits
being flipped or transposed at all, since an even number of changed
bits leaves the parity untouched. CRCs can. That extra robustness
has to be DESIGNED in, and even testing doesnt necessarily make
you aware of the problem; if the design isnt too bad the defect may
well not show up often enough to even get noticed in the testing
stage.
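Roughly what I mean, as a C sketch. The buffer and names here are just
made up for illustration, and the CRC is the common CCITT 0x1021 one;
flip two bits with simulated noise and the parity doesnt budge, but the
CRC does.

/* Sketch only: a whole-buffer parity check versus a CRC-16 (CCITT
   polynomial 0x1021). Flipping two bits leaves the parity untouched
   but changes the CRC. */
#include <stdio.h>

static unsigned char parity(const unsigned char *buf, int len)
{
    unsigned char p = 0;
    int i;
    for (i = 0; i < len; i++) {
        unsigned char b = buf[i];
        while (b) { p ^= (unsigned char)(b & 1); b >>= 1; }  /* XOR of every bit */
    }
    return p;
}

static unsigned short crc16(const unsigned char *buf, int len)
{
    unsigned short crc = 0xFFFF;
    int i, bit;
    for (i = 0; i < len; i++) {
        crc ^= (unsigned short)(buf[i] << 8);
        for (bit = 0; bit < 8; bit++)
            crc = (crc & 0x8000) ? (unsigned short)((crc << 1) ^ 0x1021)
                                 : (unsigned short)(crc << 1);
    }
    return crc;
}

int main(void)
{
    unsigned char msg[] = "HELLO WORLD";
    int len = (int)sizeof(msg) - 1;

    printf("clean:   parity=%d crc=%04X\n", parity(msg, len), crc16(msg, len));

    msg[2] ^= 0x01;          /* noise flips one bit ...                 */
    msg[7] ^= 0x01;          /* ... and another, an even number of hits */

    printf("damaged: parity=%d crc=%04X\n", parity(msg, len), crc16(msg, len));
    /* the parity is identical both times, the CRC is not */
    return 0;
}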

BL> Bugs are mostly oversights, not errors in the computer itself.

RS> Yes, the vast bulk of them are indeed silly stuff which when you
RS> can focus on the area which is stuffing up your reaction is 'shit,
RS> how fucking obvious, what a dill'. Not all tho. There is a whole
RS> class of problem found in the testing stage where you just assume
RS> a particular alg is viable and it just plain aint, you have had a
RS> brain fart and havent considered one possibility which actually
RS> does occur in the real data being processed. And at times that
RS> makes the alg totally unusable.

BL> Yair... the second is a logic flaw.

I didnt mean it quite like that either. That deficiency of parity
checking is a good example of that. It generally doesnt matter much
for some problems, say memory, where the failure modes just dont
normally result in bits swapping position. But in a medium where
they can, you have to use a test which can see those, the CRC. And
it may not be at all obvious in the design phase that that can even
happen, and it can fang your arse severely if it does and you only
discover it after the thing is out in the field.

One of the all time classics that bit people on the bum time after
time was software flow control. It has some real subtleties in it,
stuff like the time it takes the actual flow control bytes to get
thru the channel, the risk of falsely detecting flow control bytes
which are actually normal bytes fanged by noise, etc. Its quite
feasible to do a robust implementation, but the subtleties fanged
quite a few as everyone added it.
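To give the flavour of the first subtlety, a rough C sketch of the
receive side. None of this is any real driver; the names, buffer size
and watermarks are all invented for illustration. The point is that the
high water mark has to leave headroom for the bytes still in flight
after you send the XOFF, because the far end keeps transmitting until
the XOFF actually reaches it.

/* Sketch only: XON/XOFF receive side with headroom in the high
   water mark for bytes still in flight after the XOFF goes out. */
#include <stdio.h>

#define XON   0x11
#define XOFF  0x13

#define BUF_SIZE   4096
#define HEADROOM    512              /* worst case bytes still in flight */
#define HIGH_WATER (BUF_SIZE - HEADROOM)
#define LOW_WATER  (BUF_SIZE / 4)

static unsigned char rxbuf[BUF_SIZE];
static int rxcount = 0;
static int stopped = 0;              /* have we sent an XOFF?            */

static void uart_send(unsigned char c)    /* stand-in for the real UART  */
{
    printf("-> sent %s at count %d\n", c == XOFF ? "XOFF" : "XON", rxcount);
}

static void rx_byte(unsigned char c)      /* called for each rx byte     */
{
    if (rxcount < BUF_SIZE)
        rxbuf[rxcount++] = c;

    if (!stopped && rxcount >= HIGH_WATER) {
        uart_send(XOFF);             /* ask the far end to shut up       */
        stopped = 1;                 /* but expect more bytes anyway     */
    }
}

static void consumer_took(int n)          /* application drained n bytes */
{
    rxcount -= n;
    if (stopped && rxcount <= LOW_WATER) {
        uart_send(XON);              /* safe to open the tap again       */
        stopped = 0;
    }
}

int main(void)
{
    int i;
    for (i = 0; i < 3800; i++)       /* sender blasts on past the XOFF   */
        rx_byte('x');
    consumer_took(3000);             /* application finally catches up   */
    return 0;
}

The other subtlety, noise turning an ordinary byte into a fake XON or
XOFF or wrecking a real one, is why a robust version typically has to
pair this with some sort of timeout, or only use it on links where
those byte values cant legitimately occur in the data.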

BL> It amazes me how something sounds quite sensible until you find
BL> the flaw. Language is quite imprecise, and our brains are designed
BL> to operate on estimates, but you get the same thing in circuit
BL> design.

Sure, but thats again another area where software can be quite different
to a lot of other engineering. Near enough often just aint good enough.
You cant work to a 10% tolerance and call that good enough most of the
time. It really is very different in the sense that it all has to be
right or it really will inevitably fang you.

BL> This is what has surprised me most of all about programming as
BL> I get deeper into it: how similar it is to what I've been doing
BL> for the last 35 years in circuit design. You make the same silly
BL> assumptions, and get the same bites on the bum.

Sure, there certainly are some similarities. I was really just
quibbling with your 'very like designing a circuit', I think 'very
like' is overstating it considerably, it can be very different too.

The other thing I like about software over any sort of hardware
is that you can fiddle forever without it getting more and more
mangled in the process. Very different in that regard, and you dont
have the other thing you usually see with hardware, the step between
a working prototype and volume production and the glitches you get
with that. Software doesnt have that problem at all.

Software also quite often has the complete opposite of whats
essential with hardware, where if you want to do more, you need
more bits. Software often gets extra complexity literally free in
a production cost sense, something you dont see with hardware all
that much. You can say do a fancy key debounce for free, with
hardware you usually cant.
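Something like this rough C sketch of a counter based debounce, for
example, is essentially free once you have a timer tick anyway. The
names and tick counts are invented for illustration; in hardware the
same job costs you an RC network or a latch per key.

/* Sketch only: accept a key state change only after the raw line has
   held steady for a few timer ticks. */
#include <stdio.h>

#define STABLE_TICKS 5               /* how long the line must hold      */

static int debounced = 0;            /* accepted key state               */
static int last_raw  = 0;            /* last raw sample seen             */
static int ticks     = 0;            /* how long its been steady         */

/* Call once per timer tick with the raw (bouncy) key sample.
   Returns 1 when the debounced state changes, else 0. */
static int debounce(int raw)
{
    if (raw != last_raw) {           /* line moved, restart the count    */
        last_raw = raw;
        ticks = 0;
        return 0;
    }
    if (last_raw != debounced && ++ticks >= STABLE_TICKS) {
        debounced = last_raw;        /* held steady long enough          */
        return 1;
    }
    return 0;
}

int main(void)
{
    /* a key press with some contact bounce on the front of it */
    int samples[] = { 0, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1 };
    int i;
    for (i = 0; i < (int)(sizeof samples / sizeof samples[0]); i++)
        if (debounce(samples[i]))
            printf("key is now %s\n", debounced ? "down" : "up");
    return 0;
}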

You can often allow for some quite uncommon event in software for
free, not usually with hardware tho. For example its traditional
with positioning systems to recalibrate, say a floppy head, if you
have a problem reading data from a floppy. That usually just isnt
economically feasible done in hardware.
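The retry and recalibrate idea looks roughly like this in C.
read_sector() and recalibrate() are purely hypothetical stand-ins for
whatever the real controller interface is, stubbed here so the sketch
stands on its own.

/* Sketch only: a few plain retries, then home the head and try the
   whole lot again before giving up. */
#include <stdio.h>

#define SOFT_RETRIES 3               /* plain retries per pass           */
#define HARD_RETRIES 2               /* recalibrate-and-retry passes     */

/* toy stand-ins so the sketch compiles on its own */
static int fails_left = 4;
static int read_sector(int track, int sector, void *buf)
{
    (void)track; (void)sector; (void)buf;
    return fails_left-- > 0 ? -1 : 0;    /* fail a few times, then work  */
}
static void recalibrate(void)
{
    printf("recalibrating head\n");      /* step back to track 0         */
}

static int robust_read(int track, int sector, void *buf)
{
    int pass, attempt;

    for (pass = 0; pass <= HARD_RETRIES; pass++) {
        for (attempt = 0; attempt < SOFT_RETRIES; attempt++)
            if (read_sector(track, sector, buf) == 0)
                return 0;            /* got it                           */
        recalibrate();               /* head may have lost its position  */
    }
    return -1;                       /* genuinely unreadable             */
}

int main(void)
{
    char buf[512];
    printf("read %s\n", robust_read(0, 1, buf) == 0 ? "ok" : "failed");
    return 0;
}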

BL> The difference is that VB suits my approach very well - a flameproof
BL> compiler that protects itself against my wildest guesses.

Yeah, but again, thats another area where software varies. Some of
it really does severely penalise you for not thinking thru the design
thoroughly before you start, and other software approaches dont.
Some of it is very well suited to just doing what seems obvious
and improving the bits where deficiencies show up without a
total rethink.

BL> The BC++ compiler is a monster to me.

Part of thats just the much greater complexity of software. You
really do need to be able to think in the system you are using
effortlessly before it becomes comfortable to use. Once you just
know whats possible with say validating user entered data, that has
a big effect on the design fundamentals at times. It can all get
in the road severely until you are fluent with it tho.

Software also rewards people who can work with an overall view, not
worrying about the detail of how you get multiple PKTs in when
you are concentrating on cracking the format of one. But you need
to be aware of whats dead easy and whats just about impossible too,
otherwise you can do the design assuming that something is dead easy,
say getting the size of a disk file, and find it fangs your arse
severely when that data, which is vital to the approach you want
to use, turns out to be hard to get.

BL> The C language itself is easier than VB, but you need
BL> to get it right or it will do something totally mad.

Yeah, C was always designed to be like that, lots of power, you can

(Continued to next message)

--- PQWK202
* Origin: afswlw rjfilepwq (3:711/934.2)
SEEN-BY: 690/718 711/809 934
@PATH: 711/934

SOURCE: echomail via fidonet.ozzmosis.com
