echo: rberrypi
to: HEADSTONE255.BUT.NOT.THES
from: JAN PANTELTJE
date: 2018-04-17 10:42:00
subject: Re: Blinkenlights?

On a sunny day (Tue, 17 Apr 2018 11:08:56 +0100) it happened Gareth's
Downstairs Computer wrote in
:

>On 17/04/2018 10:36, Jan Panteltje wrote:
>> To me asm is a high level language....
>
>Right on, brother!
>
>> However, since things simply move on, it all went with the garbage one day.
>
>The last remaining vestige of my 1973 computer went that way
>a couple of years ago. It was the aluminium front panel and had
>only survived because it had been built in as the floor of the
>bridge going across the pond for my 16mm narrow gauge garden
>railway!
>
>> If he really wants to see registers etc in ARM then there are cool
>> disassemblers, you can get gcc to output asm too.
>
>But that requires the complexity of the display; no good if there's
>something amiss with the I/O driving the display.

OK, but most of the time I use the raspi via ssh -Y from a big PC on the LAN
for development.
My Samsung monitor is from 2007; it went back once for repair under warranty,
and I once replaced the electrolytic caps in its power supply.
It is on most of the day.
Things are very reliable really, and I have backups of course.
I also have some small analog-input monitors from ebay,
and a small HDMI panel from ebay for a raspi.
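
About getting asm out of gcc: a minimal sketch (any trivial test file will do,
the name blink.c here is only an example) is to compile with -S and read the
generated assembly, which on the raspi is ARM code:

/* blink.c - trivial test file, name is only an example.
   Compile with:  gcc -S -O2 blink.c
   then read the generated blink.s to see which registers, loads and
   stores the compiler produces for the ARM. */
unsigned int toggle(unsigned int reg, unsigned int bit)
{
    return reg ^ (1u << bit);   /* flip one bit, as for a blinking LED */
}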


>> These days for the simplest things bloated code is written using bloated
>> languages by people that have no clue of the hardware, or even how a
>> computer works.
>> I really do think some ASM hardware embedded programming should be required
>> for programmers.
>> It would save a lot of energy consumption and cost, reduce glowballworming.
>
>Beats me how some claiming to be computer science graduates have
>no idea of how a computer works!

Yeah, I was reading there is a company that will make 'anybody a website
designer in 6 weeks' or something like that;
it really shows on some sites :-)
My background is electronics, and that still needs study every day,
it is a fast-moving field.
Programming sort of came in and reduced the hardware chip count...
Memories of big boards full of wirewrap come to mind.

Running Linux makes things easy; as long as you are not doing kernel work,
there is little chance your display will fail in some way.
OK, you can mess up X11, but that is not that hard to code for.
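
Something like this, for example (just a sketch, assuming the Xlib development
headers are installed), checks whether the display opens at all and falls back
to plain console output when it does not:

/* x11check.c - just a sketch, file name is only an example.
   Build: gcc x11check.c -o x11check -lX11
   Checks whether the X display can be opened at all, and falls back
   to plain console output if it cannot. */
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);        /* NULL means: use $DISPLAY */
    if (dpy == NULL)
    {
        fprintf(stderr, "no X11 display, printing to console instead\n");
        return 1;                             /* caller can fall back here */
    }
    printf("connected to X display %s\n", DisplayString(dpy));
    XCloseDisplay(dpy);
    return 0;
}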

Have you ever programmed FPGAs? You can design your own processor if you want,
create your own processor core:
 https://opencores.org/projects

I have used FPGAs mainly for crypto and video; a sequential thing like a
processor is not always the fastest solution.
I think for cryptocurrencies we now see that trend back to specialized cores
again, better than using graphics cards.
China... in the lead.

--- SoupGate-Win32 v1.05
* Origin: Agency HUB, Dunedin - New Zealand | FidoUsenet Gateway (3:770/3)
