echo: rberrypi
to: AHEM A RIVET'S SHOT
from: MARTIN GREGORIE
date: 2017-05-05 17:28:00
subject: Re: Google releases DIY o

On Fri, 05 May 2017 16:04:54 +0100, Ahem A Rivet's Shot wrote:

> On Fri, 5 May 2017 13:03:03 -0000 (UTC)
> Martin Gregorie  wrote:
>
>> Try this for size: to pass as an AI, a system must:
>>
>> 1) pass the Turing Test in more than one domain
>>
>> 2) when asked a question in a domain it has learned it must be able to
>>    (a)give a correct answer and (b) explain how it arrived at it
>>
>> 3) be able to learn how to play a game or to understand a technology by
>>    reading a book or manual (chess, go, C) or the game's instructions
>>    (monopoly, D&D)
>>
>> 4) be able to absorb more than one knowledge domain without getting
>>    confused.
>
>  Ah these things are called general Artificial Intelligence these
> days. Hot research topic.
>
That doesn't surprise me at all, though I don't like that term much as it
sounds more than slightly premature.


>> Current systems can do 1 and 2a though I don't think anything can do
>> both, and no current neural net-based system can do 2b. AFAIK nothing
>> currently comes even close to tackling 3 or 4.
>
>  There is definitely progress being made on 3 and as a result 4,
> nothing by way of products but a good many papers and results.
>
Sounds good.

>> > Or the ones that learn languages and are currently being used to try
>> > and learn dolphin language ?
>> >
>> Are these just neural networks or something better and more flexible?
>
>  Neural networks with memory and some fancy self training AFAICT.
>
I thought that would be the case.

I find this type of system a bit worrying: if you can't make a complex
algorithm show you how it solved a problem, it can be really hard to tell
whether its answer is valid, or to fix bugs.
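The contrast can be sketched in a few lines of Python. This is a toy
illustration only (the classifiers, rules, and weights are all invented for
the example): one classifier returns a trace of the rules it applied, the
other returns only an answer, with the "why" buried in opaque weights.

```python
# Toy contrast: a rule-based "expert" classifier that can show its
# reasoning, versus an opaque learned function that cannot.
# All rules, names, and weights here are invented for illustration.

def expert_classify(temp_c):
    """Return (answer, trace); the trace is the explanation."""
    trace = []
    if temp_c < 0:
        trace.append(f"{temp_c} < 0, so: freezing")
        return "freezing", trace
    trace.append(f"{temp_c} >= 0, so not freezing")
    if temp_c < 25:
        trace.append(f"{temp_c} < 25, so: mild")
        return "mild", trace
    trace.append(f"{temp_c} >= 25, so: hot")
    return "hot", trace

# Stand-in for a trained net: numbers with no human-readable meaning.
WEIGHTS = [0.73, -1.2, 0.05]

def opaque_classify(temp_c):
    """Return only an answer; there is nothing to inspect but weights."""
    score = WEIGHTS[0] * temp_c + WEIGHTS[1] + WEIGHTS[2] * temp_c ** 2
    return "hot" if score > 20 else "mild"

answer, trace = expert_classify(30)
print(answer)                # hot
print("; ".join(trace))      # every step of the reasoning is auditable
print(opaque_classify(30))   # an answer, but no trace to check it against
```

With the first classifier a wrong answer points you straight at the faulty
rule; with the second, all you can do is retrain and hope.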

>> > Passing a Turing test calls for either something designed for
>> > conversation or an artificial general intelligence.
>> >
>> Quite, and IMHO that's what the term should be reserved for.
>
>  Trouble is that leaves no term for things like the above which are
> not general AI but are also not programmed explicitly.

Maybe better names would be Trainable Algorithm, if it can't explain how
it got its answer, or Expert Algorithm, if it can.

I think you have to use different names because whether an algorithm can
explain itself should be a major factor when you're deciding how much you
trust its output. After all, that's no different to how you decide
whether to trust what a human tells you.


--
martin@   | Martin Gregorie
gregorie. | Essex, UK
org       |

--- SoupGate-Win32 v1.05
* Origin: Agency HUB, Dunedin - New Zealand | FidoUsenet Gateway (3:770/3)

SOURCE: echomail via QWK@docsplace.org
