💾 Archived View for dioskouroi.xyz › thread › 29426218 captured on 2021-12-03 at 14:04:38. Gemini links have been rewritten to link to archived content

-=-=-=-=-=-=-

Artificial intelligence that understands object relationships

Author: jonbaer

Score: 24

Comments: 6

Date: 2021-12-03 03:12:05

Web Link

________________________________________________________________________________

anthk wrote at 2021-12-03 04:23:18:

This is older than even Unix. Check

https://hci.stanford.edu/winograd/shrdlu/

mdp2021 wrote at 2021-12-03 05:33:12:

Yes, surely, but those techniques were implemented as deterministic, hand-coded algorithms, as opposed to artificial neural networks. Patrick Winston may come to mind: "the new technologies do the same things we used to do, only worse" (not a literal quotation).

I think this effort is towards modelling networks that "approach the environment more like a human would" (e.g. "try and see shapes instead of textures").

It is not clear to me which problem is solved. Is there a road towards progress in "fuzzy Minsky" (trying to convey the idea)?

4chub wrote at 2021-12-03 06:51:29:

AI! AI! SHRDLU FHTAGN

nikolay wrote at 2021-12-03 04:20:13:

Isn't "understands" a little too ambitious for what it actually does?

bionhoward wrote at 2021-12-03 11:48:31:

Why would it be? What is understanding, if not simply having the right mental model?

nefitty wrote at 2021-12-03 14:44:46:

A mental model implies a mind is involved.

leobg wrote at 2021-12-03 15:12:40:

There is no such thing as a mind, nor a mental model. That's what B.F. Skinner would say. For him, there was just an organism, and the contingencies of the environment that shape its behavior.

That's actually very close to what is happening in machine learning today. Those structuralists who talked about a "mind" or "mental concepts" would be at a loss to understand, much less build, anything like GPT-3 or Tesla Autopilot.